WorldViz User Forum


teklab 01-13-2014 09:25 AM

Oculus Rift Drift
 
We are running an Oculus Rift with our Vizard Wirks package for an experiment that involves a subject looking down and to the right for extended periods of time while virtual targets are presented. I calibrate the Rift before every test run, and have tried running it with the magnetic yaw correction in the Oculus calibration settings both on and off, but I cannot eliminate a problem with yaw drift. For example, when the script starts, looking straight is "straight" in the virtual world. However, after a few minutes of looking down and to the right, looking straight is no longer "straight" in the world, but more skewed toward a corner. I have also tried turning off the sensor.setPrediction setting within Vizard. Resetting the sensor with the subject looking straight ahead fixes the problem briefly, but the drift persists afterwards.
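
For reference, the per-run calibration/reset I do boils down to something like the sketch below; a minimal sketch assuming Vizard's standard oculus module, with the reset bound to a key so I can re-zero the view while the subject looks straight ahead (the setPrediction argument form is my assumption).

Code:

import viz
import vizact
import oculus

viz.go()

hmd = oculus.Rift()          # connect to the Rift through Vizard's oculus module
sensor = hmd.getSensor()     # orientation sensor driving the viewpoint
sensor.setPrediction(False)  # prediction turned off, as mentioned above (exact argument may vary by version)

viz.link(sensor, viz.MainView)

# Press 'r' while the subject looks straight ahead to re-zero the orientation.
# This hides the drift only briefly; it builds up again within a few minutes.
vizact.onkeydown('r', sensor.reset)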

Any tips for fixing this drift?

willpower2727 03-03-2016 06:10 AM

I've also experienced an issue with the DK2 (Vizard 5) where the sensors lag and then apply a different offset to the orientation. The effect is nauseating. I'm not sure what the cause is, maybe my loop is running too fast, but it would be nice to find a way to avoid it.

Jeff 03-04-2016 09:05 PM

What is the result when you run the oculusExample.py script included with Vizard?

willpower2727 03-08-2016 10:09 AM

I tested it just now with the oculusExample.py script. Please note that I am not using the position tracker at the moment, just the HMD's gyro.

I did not notice a fast yaw drift; if there was a drift, it was slow. However, what I do see is that the orientation often freezes, and when tracking resumes the orientation is not what it should be.

Using the vizconnect tool I could observe a yaw drift as well as the "freezing" of the gyroscope outputs. I wonder if the lack of the position camera has something to do with this?

willpower2727 03-08-2016 11:06 AM

I've posted a video of what this looks like. Hopefully this helps explain what is happening.

https://www.youtube.com/watch?v=cLwD...ature=youtu.be

Jeff 03-08-2016 11:57 PM

If Vizard is running at a stable 75 fps, then it may be a hardware or driver issue, especially if it was working well before. If you have not already, try updating to runtime 0.8, Vizard 5.3, and the latest GPU drivers. Also, make sure you meet the recommended Oculus hardware specs.

willpower2727 03-09-2016 11:44 AM

I've verified the following:

frame rate: 75 fps
Oculus configuration utility: v 1.1
Oculus SDK 0.8.0.0
Graphics card: Nvidia Quadro 4000 (a little older)
Graphics driver: 361.75

Is it still plausible that the graphics card would be causing a delay in the tracking? Is there something in the inner workings of the vizconnect link that allows the graphics to block gyroscope acquisition?

Jeff 03-09-2016 07:35 PM

Vizconnect would not be blocking any data; it just generates the Vizard code that connects to the tracker. I would expect you'll see the same issue in other Oculus apps, at least those based on OpenGL like Vizard.

willpower2727 03-14-2016 06:16 AM

It turns out the delays were caused by a bad USB cable. Sorry for the undue hassle.

willpower2727 03-15-2016 08:09 AM

The yaw drift is still present, however. Any advice on correcting for it? Should I try to re-calibrate the magnetometer? Apply a stronger magnetic field?

mape2k 03-29-2016 12:52 AM

I would not advise you to apply a strong magnetic field. That could potentially do more harm than good.

Are you using the Oculus together with the camera (which is supposed to go on top of the monitor)? If not, the slow yaw drift is normal behavior for the Oculus and cannot be corrected.

We are having the same issue here, as we are using the Oculus together with a Vicon tracking system inside a large room. Thus, we cannot use the camera, and we get a slow yaw drift (1-4 deg/min) whenever the Oculus is not in motion.

willpower2727 03-29-2016 12:01 PM

Our lab is for gait studies; subjects typically walk back and forth on a 9 meter walkway, which renders the position tracker that came with the Oculus useless. Not being able to get around the yaw drift, which I agree can't be corrected, I attached a cluster of markers to the HMD and use the marker data to update the orientation of the view in "live mode". It's not as nice as the 1000 Hz gyro, and I use a Kalman filter to try to reduce the jitter, but at least there is no yaw drift. I suppose if I felt outgoing I could continue to use the gyros and use the marker data to correct the yaw once in a while.
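
If I were to try that combination, it would be something like the rough sketch below, where getOpticalYaw() is just a placeholder for whatever returns the marker-based heading of the HMD in degrees:

Code:

import viz
import vizact
import oculus

viz.go()

hmd = oculus.Rift()
sensor = hmd.getSensor()

# Parent node that carries the yaw correction; the gyro orientation is applied on top of it.
navigationNode = viz.addGroup()
viewLink = viz.link(navigationNode, viz.MainView)
viewLink.preMultLinkable(sensor)

def getOpticalYaw():
    # Placeholder: return the marker-based heading of the HMD in degrees.
    return 0.0

def correctYaw():
    gyroYaw = sensor.getEuler()[0]
    opticalYaw = getOpticalYaw()
    # The final view yaw is roughly navigationNode yaw + gyro yaw, so set the node's
    # yaw to whatever makes that sum match the optical heading. In practice you would
    # want to blend this in gradually to avoid a visible snap.
    navigationNode.setEuler([opticalYaw - gyroYaw, 0, 0])

# Re-align every 10 seconds instead of every frame, keeping the gyro feel in between.
vizact.ontimer(10, correctYaw)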

Jeff 03-29-2016 10:41 PM

If you have two position trackers fixed to the HMD, one on the left and one on the right, you can use vizconnect's optical heading tracker to correct yaw drift. In vizconnect, first add the two position trackers and the orientation tracker. Then add the optical heading tracker and use it to drive the viewpoint.

mape2k 03-30-2016 06:40 AM

We are having similar issues here. We attach markers to the Oculus to track the position of participants inside a large room and use the Oculus gyroscopic data for the yaw, pitch, and roll of the viewpoint. For now, I just have a function running that checks for disparities between the Oculus' yaw and the yaw provided by the optical tracking system. If the difference between the two measurements gets too large, a warning is issued to the user.

This is not really ideal either. We tried experimenting without the Oculus gyroscopic data at all, using only optical tracking, but this gives jerky results. If we try to filter that, the sensation of head movement quickly becomes sluggish.
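
The disparity check itself is nothing sophisticated; roughly the following, where riftSensor and viconTracker stand in for our Oculus and Vicon tracker handles and the threshold is just the value we happened to pick:

Code:

import viz
import vizact

YAW_DISPARITY_LIMIT = 10.0  # degrees; placeholder threshold

def checkYawDisparity():
    riftYaw = riftSensor.getEuler()[0]       # yaw from the Oculus gyro
    opticalYaw = viconTracker.getEuler()[0]  # yaw from the optical tracking system
    # Wrap the difference into [-180, 180] before comparing.
    diff = (riftYaw - opticalYaw + 180.0) % 360.0 - 180.0
    if abs(diff) > YAW_DISPARITY_LIMIT:
        print('Warning: Oculus yaw has drifted %.1f deg from the optical tracking' % diff)

# Check once per second rather than every frame.
vizact.ontimer(1, checkYawDisparity)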

willpower2727 03-30-2016 08:34 AM

optical tracking filters
 
For pure optical tracking we experience some "jitter", as you say. I tried two types of filters: a Kalman filter and a moving average with a narrow time window to avoid lag.

Our system is Nexus 2.2.3 with Vizard 5; there are lots of variables within the Nexus system parameters alone that could contribute to how noisy your data is. I've attached a graph of what my filters do to the raw data. I decided the Kalman filter was the best compromise between smooth tracking and minimum lag. The experience isn't too bad, but not as nice as the gyros.
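
For what it's worth, the moving average was just a short per-channel buffer, something like the sketch below (the window length is whatever keeps the lag tolerable; five samples is only an example):

Code:

from collections import deque
import numpy as np

class MovingAverage(object):
    """Average of the last `window` samples of a vector signal (e.g. yaw, pitch, roll, x, y, z)."""
    def __init__(self, window=5):
        self.buffer = deque(maxlen=window)

    def update(self, sample):
        self.buffer.append(np.asarray(sample, dtype=np.float64))
        return np.mean(self.buffer, axis=0)

# Usage with the six channels coming from Nexus:
poseFilter = MovingAverage(window=5)
# smoothedPose = poseFilter.update([yaw, pitch, roll, x, y, z])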

mape2k 03-31-2016 12:02 AM

Thanks for sharing your results. It looks interesting; we might consider applying a Kalman filter to the yaw data. Are you using your own implementation, or is the filtering done by the Nexus software package?

willpower2727 03-31-2016 06:14 AM

Kalman Filter for Euler Angles and Position data
 
I had to implement my own filter; Nexus does not have options for a real-time filter (although we use 7 markers on the HMD, and Nexus 2 in kinematic fit mode will use all available markers on the segment to compute the best-fit position and orientation, so the more markers, the smoother the result). I'll share the basics here in case anyone is interested. First, I want to give credit where it is due: most of my code is adapted from this example: http://arxiv.org/ftp/arxiv/papers/1204/1204.0375.pdf

My model basically assumes that at each iteration the orientation and position have not changed since the last iteration. This is a really simple model, easy to implement, and it has some nice characteristics (no overshoot or ripple). I tuned the filter's smoothing and time-lag response by altering the model and measurement noise covariance parameters (not sure if this is the legitimate way to tune a Kalman filter).

Here is the code:

Code:

import math

import viz
import oculus
import numpy as np

#this is set up to filter six states: three Euler angles and three position coordinates

global X #k-1 timestep state estimate initiated as zeros
X = np.array([[0],[0],[0],[0],[0],[0]],dtype=np.float)

global P #state covariance prediction
P = np.diag((0.01,0.01,0.01,0.01,0.01,0.01))

global A #system model matrix
A = np.eye(6,dtype=np.float)

xshape = X.shape

dq = 0.35#noise covariance of euler angles in the model
dqq = 0.5#noise covariance of position data in the model

global Q #process noise covariance
Q = np.array([[dq,0,0,0,0,0],[0,dq,0,0,0,0],[0,0,dq,0,0,0],[0,0,0,dqq,0,0],[0,0,0,0,dqq,0],[0,0,0,0,0,dqq]],dtype=np.float)

global B #system input matrix
B = np.eye(xshape[0])

du = 0.0001  #control input term for the euler angle states
duu = 0.0001  #control input term for the position states

global U #control input vector (enters the prediction step as B*U)
U = np.array([[du],[du],[du],[duu],[duu],[duu]])

global Y #measurement array
Y = np.array([[0],[0],[0],[0],[0],[0]],dtype=np.float)
yshape = Y.shape

global H #measurement matrix
H = np.eye(yshape[0])

global R #measurement covariance
R = np.eye(yshape[0])

def kf_predict(X,P,A,Q,B,U):#predict the current iteration's state
        X = np.dot(A, X) + np.dot(B, U)
        P = np.dot(A, np.dot(P, A.T)) + Q
        return(X,P)
       
def gauss_pdf(X, M, S):
        mshape = M.shape
        xshape = X.shape
        if mshape[1] == 1:
                DX = X - np.tile(M, xshape[1])
                E = 0.5 * np.sum(DX * (np.dot(np.linalg.inv(S), DX)), axis=0)
                E = E + 0.5 * mshape[0] * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(S))
                P = np.exp(-E)
               
        elif xshape[1] == 1:
                DX = np.tile(X, mshape[1]) - M
                E = 0.5 * np.sum(DX * (np.dot(np.linalg.inv(S), DX)), axis=0)
                E = E + 0.5 * mshape[0] * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(S))
                P = np.exp(-E)
        else:
                DX = X-M
                E = 0.5 * np.dot(DX.T, np.dot(np.linalg.inv(S), DX))
                E = E + 0.5 * mshape[0] * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(S))
                P = np.exp(-E)
       
        return (P[0],E[0])

def kf_update(X,P,Y,H,R):#use the predicted state and current measurement to determine the current iteration's state
        IM = np.dot(H, X)
        IS = R + np.dot(H, np.dot(P, H.T))
        K = np.dot(P, np.dot(H.T, np.linalg.inv(IS)))
        X = X + np.dot(K, (Y-IM))
        P = P - np.dot(K, np.dot(IS, K.T))
        LH = gauss_pdf(Y, IM, IS)
        return (X,P,K,IM,IS,LH)

#program loop running in real time ('stop' is assumed to be set elsewhere when the trial ends)
while not stop:

        global X
        global P
        global A
        global Q
        global B
        global U
        global Y
        global H
        global R

        #Y = ?? update your measurement array
        #example, Y.itemset(0,Yaw)

        #run it through the filter
        (X,P) = kf_predict(X,P,A,Q,B,U)
        (X,P,K,IM,IS,LH) = kf_update(X,P,Y,H,R)

        #now use the filtered states in X to update the display
        navigationNode.setEuler(-1*float(X[2])*180/math.pi,float(X[0])*180/math.pi,float(X[1])*180/math.pi)#kalman filtered

        navigationNode.setPosition([-1*float(X[3]),float(X[5]),-1*float(X[4])]) #Nexus is Right handed coordinates, Vizard is Left handed...


willpower2727 03-31-2016 06:30 AM

Jeff's suggestion of using two position trackers and the optical heading tracker in vizconnect might work well, but I can't test it since I have Vicon Nexus 1 and 2, not Tracker. If I had Tracker I'd try it out for sure.

Jeff 03-31-2016 07:31 AM

If you're able to get the Nexus data into Vizard, you should be able to apply that data to vizconnect trackers and use the optical heading tracker. First, add two group trackers from the trackers tab. Then apply the group trackers to the optical heading tracker. In the postInit() function of the vizconnect file, set their raw trackers to be the Nexus trackers:

Code:

# update nodes with nexus data every frame, here they are called nexusLeft and nexusRight

# get handles to group trackers added through trackers tab and apply
# nexus data to them
vizconnect.getTracker('nexus_left').setRaw(nexusLeft)
vizconnect.getTracker('nexus_right').setRaw(nexusRight)

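For the nexusLeft and nexusRight nodes themselves, something along these lines should work, assuming you already have a way of reading each marker cluster's position out of Nexus (getNexusPosition below is just a placeholder for that):

Code:

import viz
import vizact

# Group nodes that will be passed to setRaw() above.
nexusLeft = viz.addGroup()
nexusRight = viz.addGroup()

def updateNexusNodes():
    # getNexusPosition() is a stand-in for however you read the marker
    # positions from Nexus (e.g. your own DataStream SDK connection).
    nexusLeft.setPosition(getNexusPosition('hmd_left'))
    nexusRight.setPosition(getNexusPosition('hmd_right'))

# Update the nodes every frame so vizconnect's optical heading tracker sees fresh data.
vizact.onupdate(0, updateNexusNodes)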

willpower2727 03-31-2016 08:27 AM

I access Nexus data using my own TCP connection to Nexus via Vicon's SDK. As far as I know, there is no supported (by WorldViz) way to get data from Nexus into Vizard; Vicon Tracker is the program that works with vizconnect. If this is no longer true, I'd be very excited, but for now I cannot use vizconnect in the way you suggest.

mape2k 04-01-2016 07:57 AM

Thanks for sharing your code, I will give it a try if I have the time for it!

Cheers!

