  #10  
02-11-2013, 02:13 PM
ckharrison
Member
 
Join Date: Jan 2013
Location: Muncie, Indiana, USA
Posts: 5
Hi mspusch,

Thanks so much for your quick response. (Sorry I didn't subscribe to the thread, so I didn't see your reply until I just checked back in; that's all fixed now.)

Just to give you a bit of background on where I'm coming from, since I was very vague in my original post in the interest of being brief: our lab actually already owns the Cadillac setup, complete with the SX111 and PPT Studio, which we've had for just over a year now, so I'm pretty familiar with all the nice high-end components. My primary concern right now is developing as low-cost a solution as possible, one that could potentially be duplicated several times over, distributed to our friends, and then networked together to create a shared virtual reality environment.

Last summer our lab did a quick test drive of the InertiaLabs sensor, which did quite a nice job with orientation, though we were comparing it to the IC2 on our SX111, from which it is of course a slight step down. While it's a great device, I still wanted to try something even cheaper and more accessible, and came across this TrackIR system, which so far has done a remarkable job, especially given its price point. Just within the TrackIR software, I've run the tracker in first-person mode while wearing the Sony HMD, and the responsiveness is nearly instantaneous and exceptionally smooth.

I've read up on the WIRKS package and watched all the demo videos, and it does seem to be a nice all-encompassing system, but for the sake of our research we'd like to see if we can do something even more cost-effective. Since the orientation tracker seems to be the most expensive component of the WIRKS system, that's the cost we'd like to slash first.

I've already managed to get the Kinect and the Sony HMD pulled in and cooperating within Vizard, but as I'm sure you guys know, the Kinect doesn't do well at all at tracking the orientation of the head node, hence the need for the InertiaLabs sensor or the TrackIR system. So, with these two pieces in place and working, I want to mute or override the orientation data (at least for the head node) coming from the Kinect and substitute in its place the orientation from the TrackIR via its mouse-emulator mode.

The TrackIR is able to output full 6DOF tracking info, but of course I'd cut that in half and only take the orientation data and stack that onto what I'm getting from the Kinect.
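Just to make the "stacking" idea concrete, here's a rough sketch in plain Python of the merge I have in mind; the two getter functions are hypothetical placeholders for whatever ends up feeding the data in (the actual wiring into Vizard is exactly my open question):

```python
# Sketch: combine position from one tracker with orientation from another.
# kinect_pos and trackir_euler stand in for live tracker reads; the real
# data sources (Kinect skeleton, TrackIR output) are not modeled here.

def merge_head_pose(kinect_pos, trackir_euler):
    """Take x, y, z from the Kinect and yaw, pitch, roll from TrackIR."""
    x, y, z = kinect_pos              # meters, from the Kinect head joint
    yaw, pitch, roll = trackir_euler  # degrees, from the TrackIR
    return (x, y, z, yaw, pitch, roll)

# Example: Kinect head at (0.1, 1.7, 0.5), TrackIR reporting a 15-degree yaw.
pose = merge_head_pose((0.1, 1.7, 0.5), (15.0, 0.0, 0.0))
```

In other words, each frame I'd discard the Kinect's head orientation entirely and keep only its position, pairing it with the TrackIR angles.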

Back to the issue of getting the data from TrackIR into a form that plays nicely with Vizard: I mentioned that they do have a mouse emulator, but the trouble is that only two degrees of freedom come through, with X-axis mouse movement mapped to yaw and Y-axis movement to pitch, so I'd miss out on the Z-axis roll data.
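To illustrate the limitation, here's roughly what the mouse-emulation path can give me, assuming the emulator maps normalized mouse coordinates linearly onto angles (the scale factors here are made-up values, not anything from the TrackIR software):

```python
def mouse_to_orientation(mouse_x, mouse_y,
                         yaw_range=180.0, pitch_range=90.0):
    """Map normalized mouse coords in [-1, 1] to head angles in degrees.

    mouse_x -> yaw, mouse_y -> pitch. There is no third mouse axis,
    so roll is simply unrecoverable from a two-axis mouse emulator.
    """
    yaw = mouse_x * yaw_range
    pitch = mouse_y * pitch_range
    roll = 0.0  # lost: the mouse emulator has no channel for roll
    return yaw, pitch, roll

# Mouse halfway right and a quarter up:
yaw, pitch, roll = mouse_to_orientation(0.5, 0.25)
# yaw = 90.0, pitch = 22.5, roll = 0.0
```

However I tune the scaling, that third angle just isn't in the signal, which is why I'm hoping for a more direct route to the tracker data.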

I understand that TrackIR hasn't joined the VRPN bandwagon with their sensor yet, otherwise this whole issue would be nearly a no-brainer, so I'm really just trying to see what potential solutions there are for getting TrackIR to give me yaw, pitch, and roll in a form that Vizard will understand.

Thanks very much for your help!