#1
We are having similar issues here. We attach markers to the Oculus to track the participants' position inside a large room and use the Oculus gyroscope data for the yaw, pitch and roll of the viewpoint. For now, I have a function running that checks for disparities between the Oculus' yaw reading and the one provided by the optical tracking system; if the difference between the two measurements gets too large, a warning is issued to the user.
This is not really ideal either. We experimented with dropping the Oculus gyroscope data entirely and using only optical tracking, but that gives jerky results, and if we try to filter it, the head movement quickly starts to feel sluggish.
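Roughly, the disparity check looks like this. This is a simplified sketch rather than our production code; the threshold value and the two yaw inputs are placeholders (ours come from the Oculus plugin and the tracking system):

YAW_WARN_THRESHOLD = 10.0  # degrees; placeholder tolerance, tune for your setup

def yaw_disparity(gyro_yaw, optical_yaw):
    # smallest absolute difference between two angles, wrapped to [-180, 180)
    return abs((gyro_yaw - optical_yaw + 180.0) % 360.0 - 180.0)

def check_yaw_drift(gyro_yaw, optical_yaw):
    # warn the user when the gyro yaw drifts away from the optical yaw
    disparity = yaw_disparity(gyro_yaw, optical_yaw)
    if disparity > YAW_WARN_THRESHOLD:
        print('Warning: %.1f degree yaw drift between gyro and optical data' % disparity)

# example with dummy readings near the 180/-180 wrap-around
check_yaw_drift(179.0, -178.0)  # disparity is 3 degrees, no warning
check_yaw_drift(150.0, 120.0)   # disparity is 30 degrees, warning issued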
#2
Optical tracking filters
For pure optical tracking we experience some "jitter", as you say. I tried two types of filters: a Kalman filter and a moving average with a narrow time window to avoid lag.
Our system is Nexus 2.2.3 with Vizard 5; there are lots of variables within the Nexus system parameters alone that could contribute to how noisy your data is. I've attached a graph of what my filters do to the raw data. I decided the Kalman filter was the best compromise between smooth tracking and minimal lag. The experience isn't too bad, but not as nice as the gyros.
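In case it is useful, the moving-average variant is only a few lines. This is a minimal sketch rather than our exact code, and the five-sample window is a placeholder:

from collections import deque

class MovingAverage(object):
    # fixed-width moving average; a narrow window keeps the added lag small
    def __init__(self, window=5):  # window length in samples, placeholder value
        self.samples = deque(maxlen=window)

    def update(self, value):
        self.samples.append(value)
        return sum(self.samples) / float(len(self.samples))

# example: smooth a noisy yaw stream sample by sample
smoother = MovingAverage(window=5)
for raw_yaw in [10.2, 10.7, 9.9, 10.4, 10.1]:
    filtered_yaw = smoother.update(raw_yaw)

One caveat: a plain average misbehaves near the ±180 degree yaw wrap-around, so the angles need to be unwrapped before filtering.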
#3
Thanks for sharing your results. That looks interesting; we might consider applying a Kalman filter to the yaw data. Are you using your own implementation, or is the filtering done by the Nexus software package?
#4
Kalman Filter for Euler Angles and Position data
I had to implement my own filter; Nexus does not offer a real-time filter. (We do use 7 markers on the HMD, though: Nexus 2 in kinematic fit mode uses all available markers on the segment to compute the best-fit position and orientation, so the more markers, the smoother the result.) I'll share the basics here in case anyone is interested. First, I want to give credit where it is due: most of my code is adapted from this example: http://arxiv.org/ftp/arxiv/papers/1204/1204.0375.pdf
My model simply assumes that at each iteration the orientation and position have not changed since the last iteration, i.e. the state transition matrix is the identity. This model is really simple, easy to implement, and has some nice characteristics (no overshoot or ripple). I tuned the filter's smoothing and time-lag response by altering the process and measurement noise covariance parameters (not sure if that is the legitimate way to tune a Kalman filter). Here is the code:
import viz
import oculus
import math
import numpy as np

# This is set up to filter six states: the three Euler angles
# followed by the three position coordinates.
X = np.zeros((6, 1))                               # k-1 timestep state estimate, initialized to zeros
P = np.diag((0.01, 0.01, 0.01, 0.01, 0.01, 0.01))  # state covariance
A = np.eye(6)                                      # system model matrix (identity: state assumed unchanged)
xshape = X.shape
dq = 0.35    # process noise covariance of the Euler angles in the model
dqq = 0.5    # process noise covariance of the position data in the model
Q = np.diag((dq, dq, dq, dqq, dqq, dqq))           # process noise covariance
B = np.eye(xshape[0])                              # system input matrix
du = 0.0001   # control input term for the Euler angle states
duu = 0.0001  # control input term for the position states
U = np.array([[du], [du], [du], [duu], [duu], [duu]])  # control input vector
Y = np.zeros((6, 1))                               # measurement array
yshape = Y.shape
H = np.eye(yshape[0])                              # measurement matrix
R = np.eye(yshape[0])                              # measurement noise covariance

def kf_predict(X, P, A, Q, B, U):
    # predict the current iteration's state from the previous estimate
    X = np.dot(A, X) + np.dot(B, U)
    P = np.dot(A, np.dot(P, A.T)) + Q
    return (X, P)

def gauss_pdf(X, M, S):
    # likelihood of measurement X given predicted mean M and covariance S
    mshape = M.shape
    xshape = X.shape
    if mshape[1] == 1:
        DX = X - np.tile(M, xshape[1])
        E = 0.5 * np.sum(DX * (np.dot(np.linalg.inv(S), DX)), axis=0)
        E = E + 0.5 * mshape[0] * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(S))
        P = np.exp(-E)
    elif xshape[1] == 1:
        DX = np.tile(X, mshape[1]) - M
        E = 0.5 * np.sum(DX * (np.dot(np.linalg.inv(S), DX)), axis=0)
        E = E + 0.5 * mshape[0] * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(S))
        P = np.exp(-E)
    else:
        DX = X - M
        E = 0.5 * np.dot(DX.T, np.dot(np.linalg.inv(S), DX))
        E = E + 0.5 * mshape[0] * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(S))
        P = np.exp(-E)
    return (P[0], E[0])

def kf_update(X, P, Y, H, R):
    # combine the predicted state with the current measurement
    IM = np.dot(H, X)                              # predicted measurement
    IS = R + np.dot(H, np.dot(P, H.T))             # innovation covariance
    K = np.dot(P, np.dot(H.T, np.linalg.inv(IS)))  # Kalman gain
    X = X + np.dot(K, (Y - IM))
    P = P - np.dot(K, np.dot(IS, K.T))
    LH = gauss_pdf(Y, IM, IS)
    return (X, P, K, IM, IS, LH)

# program loop running in real time
stop = False  # set True elsewhere to end the loop
while not stop:
    #Y = ?? update your measurement array
    #example: Y.itemset(0, Yaw)
    # run it through the filter
    (X, P) = kf_predict(X, P, A, Q, B, U)
    (X, P, K, IM, IS, LH) = kf_update(X, P, Y, H, R)
    # now use the filtered states in X to update the display;
    # navigationNode is the Vizard node driving the viewpoint (defined elsewhere).
    # Nexus uses right-handed coordinates, Vizard left-handed, hence the sign flips.
    navigationNode.setEuler(-1 * float(X[2]) * 180 / math.pi,
                            float(X[0]) * 180 / math.pi,
                            float(X[1]) * 180 / math.pi)
    navigationNode.setPosition([-1 * float(X[3]), float(X[5]), -1 * float(X[4])])
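To sanity-check the filter before wiring it into the real-time loop, you can feed the functions above synthetic data. Run this in place of the loop; the pose values and the 0.05 noise level are made up for the test:

# feed noisy samples of a constant pose and verify the estimate settles
np.random.seed(0)
true_state = np.array([[0.5], [0.1], [-0.2], [1.0], [1.6], [2.0]])  # 3 angles (rad), 3 positions (m)
for step in range(100):
    Y = true_state + 0.05 * np.random.randn(6, 1)  # simulated jittery tracker measurement
    (X, P) = kf_predict(X, P, A, Q, B, U)
    (X, P, K, IM, IS, LH) = kf_update(X, P, Y, H, R)
print(X.T)  # should end up close to true_state after a few dozen iterations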