#1
ImmersaDesk system tracking
We use an ImmersaDesk R2 VR system. The system has two computers: one handles mostly the graphics, while the other does the tracking.
The tracking computer has a Flock of Birds card and runs a trackd server, provided to us by VRCO (http://www.vrco.com). The graphics computer runs a trackd client which receives tracking data from the tracking computer over an RS232 connection. We are not sure exactly what information is being sent, but we do know that trackd can be used for more than just Flock of Birds. We would prefer not to alter the setup, as we will still need to use this system for vGeo, but we are very interested in Vizard. Would anyone out there have any insight into how we can use this tracking setup with Vizard?
#2
I found a trackdAPI that allows the trackd output to be shared. I'll need to look into that. I'm still not sure how I'm going to get from the API to Vizard, though.
#3
Hi,
If you have the API, then you can create a Vizard sensor plugin that will read the data and pass it to your script. You can download the Vizard SDK, which includes a sample sensor plugin. Or you can send the trackdAPI to support@worldviz.com and we can create one for you.
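For reference, once such a plug-in is built, using it from a script is straightforward. This is only a sketch, and the file name trackd.dls is just a placeholder for whatever the finished plug-in ends up being called: Code:
import viz
viz.go()

#Load the sensor plug-in by file name ('trackd.dls' is a placeholder)
sensor = viz.add('trackd.dls')

#The plug-in passes its data to the script through the sensor object
print(sensor.get())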
#4
Thanks, I've requested a copy of the API. Once it comes in, I'll send it your way. Thanks again.
#5
We have finished and tested the Vizard trackd plugin, which is attached to this post. Along with the .dls file there is a sample Python script which should illustrate how the plug-in works. Please try it out and let us know if you have any problems or suggestions for improvement.
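For anyone reading without the attachments, the sample script presumably follows the usual pattern of loading the .dls and polling it. A rough sketch, with the file name trackd.dls assumed: Code:
import viz
viz.go()

#Load the attached trackd plug-in (file name assumed)
sensor = viz.add('trackd.dls')

#Print the raw tracker data a few times per second to confirm it updates
def ontimer(num):
    print(sensor.get())
viz.callback(viz.TIMER_EVENT,ontimer)
viz.starttimer(0,0.2,viz.FOREVER)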
#6
The plugin works.
However, at times, either or both of the markers will snap back to the middle of the screen, pointing to the right. I don't know if it is the software or our sensors. I was also wondering if one of the sensors could be easily set up to do head tracking.
#7
Hi,
I'm not sure what is causing that. I've made one small change that might fix it; the updated plugin is attached below. If it doesn't fix your problem, then the issue is either in the trackd software or the sensor. To link a sensor to the viewpoint, do the following: Code:
viz.get(viz.MAIN_VIEWPOINT).link(sensor)
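Put together with the plug-in, head tracking would look roughly like this. Treat it as a sketch: the .dls file name is assumed, and how the plug-in exposes the individual birds (head sensor vs. wand) depends on the plug-in itself: Code:
import viz
viz.go()

#Load the trackd plug-in (assumed file name); which physical sensor this
#corresponds to depends on how the plug-in exposes the individual birds
head = viz.add('trackd.dls')

#Drive the main viewpoint with the head sensor
viz.get(viz.MAIN_VIEWPOINT).link(head)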
#8
Grabbing the information from the tracker, I got 7 numbers. I was only expecting 6. What do these values correspond to?
#9
Hi,
The first 3 values are the position and the remaining 4 numbers are the quaternion rotation.
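So splitting the list returned by sensor.get() looks like this (a sketch; the quaternion component order is assumed to be x,y,z,w, which is what Vizard's rotatequat-style calls expect): Code:
data = sensor.get()

x, y, z = data[0:3]          #position
qx, qy, qz, qw = data[3:7]   #orientation quaternion (x,y,z,w order assumed)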
#10
I am not familiar with quaternions. How do I get the angle the sensor is held at from that data?
What I am trying to do is draw a line, using the on-the-fly functions, that starts at my location and extends out based on the angle of the sensor in my Wanda, to be used like a laser pointer.
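As the next reply shows, the quaternion can be applied directly without converting it to angles. If explicit angles are still wanted, a standard conversion looks roughly like this; treat it as a sketch, since it assumes a normalized x,y,z,w quaternion and standard roll/pitch/yaw axes, which may not match the trackd or Vizard conventions: Code:
import math

def quat_to_euler(qx, qy, qz, qw):
    #Standard Tait-Bryan angles: roll about X, pitch about Y, yaw about Z.
    #Assumes a normalized quaternion in x,y,z,w order; axis conventions
    #may differ from what the trackd plug-in or Vizard uses.
    roll  = math.atan2(2*(qw*qx + qy*qz), 1 - 2*(qx*qx + qy*qy))
    pitch = math.asin(max(-1.0, min(1.0, 2*(qw*qy - qz*qx))))
    yaw   = math.atan2(2*(qw*qz + qx*qy), 1 - 2*(qy*qy + qz*qz))
    return [math.degrees(yaw), math.degrees(pitch), math.degrees(roll)]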
#11
Is the length of the line fixed?
Either way, just translate and rotate the line to the position/rotation of the sensor. Code:
#Create line of length 1
viz.startlayer(viz.LINES)
viz.vertex(0,0,0)
viz.vertex(0,0,1)
line = viz.endlayer()
line.dynamic()

. . .

#Update line based on sensor data
data = sensor.get()
line.translate(data[:3])
line.rotatequat(data[3:7])

#If line length is dynamic, then update it
line.vertex(1,0,0,length)
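To keep the pointer following the wand, the update lines above would normally run continuously, for example from a timer callback. A sketch, assuming the sensor and line objects created earlier: Code:
def ontimer(num):
    data = sensor.get()
    line.translate(data[:3])
    line.rotatequat(data[3:7])
viz.callback(viz.TIMER_EVENT,ontimer)
viz.starttimer(0,0.01,viz.FOREVER)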