#1
Rigid body streaming via VRPN from PhaseSpace to Vizard
Hi all,
I'm trying to stream the position/orientation data of a rigid body captured by the PhaseSpace system into Vizard, but I'm stuck. Can Recap2 stream rigid body data, and if so, how? Or do I need to write a VRPN interface to stream the data myself? Any advice would be hugely appreciated.
#2
PhaseSpace supports sending marker and rigid body data via VRPN. Contact their support to find out the tracker naming convention and how to enable VRPN streaming on their end, then use Vizard's VRPN plug-in to receive the data.
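If it helps, here is a minimal sketch of the Vizard side, assuming the PhaseSpace VRPN server exposes a tracker named Tracker0 on localhost; the tracker name, sensor index and placeholder model are assumptions to adjust once you hear back from their support:
Code:
import viz
viz.go()

# Load Vizard's VRPN plug-in and connect to the PhaseSpace VRPN server
vrpn = viz.add('vrpn7.dle')
tracker = vrpn.addTracker('Tracker0@localhost', 0)  # second argument selects the sensor

# Drive a placeholder model with the incoming pose
model = viz.add('white_ball.wrl')
viz.link(tracker, model)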
#3
I've run into a problem with the scaling of the incoming data. In the default vrpn.cfg the scale is set to 0.001, but that doesn't seem to be right: my virtual object doesn't correctly overlap with the real, tracked object. Depending on the scale, the virtual object moves faster or slower than the real one and diverges from its position the farther I move from the origin.
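For what it's worth, PhaseSpace reports positions in millimeters while Vizard works in meters, so scale=0.001 should already be the correct unit conversion; a divergence that grows with distance from the origin does sound like a scale or alignment problem, though. One way to narrow it down is to apply an extra correction factor on the Vizard side and tweak it at runtime instead of restarting the VRPN server each time. A minimal sketch, assuming the tracker is named Tracker0 on localhost and using white_ball.wrl as a stand-in for the tracked object:
Code:
import viz
import vizact

viz.go()

vrpn = viz.add('vrpn7.dle')
tracker = vrpn.addTracker('Tracker0@localhost', 0)
marker = viz.add('white_ball.wrl')  # stand-in for the tracked object

SCALE = 1.0  # extra correction on top of the server-side scale; adjust and compare

def applyPose():
    # Copy the tracker pose onto the marker, scaling the position only
    x, y, z = tracker.getPosition()
    marker.setPosition(x * SCALE, y * SCALE, z * SCALE)
    marker.setQuat(tracker.getQuat())

vizact.onupdate(0, applyPose)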
How do I get the correct scale factor? Tomorrow I'll try running the PhaseSpace plug-in for Vizard in parallel with the VRPN server in the hope of getting some kind of measurement, but any help is much appreciated.

In case anyone else is looking for this information, here is what I have found so far. First, get the PhaseSpace VRPN server from here: http://phasespace.com/vrpn/ Note that the versions in the X2E subfolder don't seem to support the stylus.

Next, configure the server: after extracting the archive, open the contained vrpn.cfg. It's a long file, so search for "phasespace" and you'll find help on the settings. You may delete everything you don't need, so your config may look like this:
Code:
vrpn_Tracker_PhaseSpace Tracker0 <owl>
# e.g., connect to Recap2's OWL Live Emulator on the same machine,
# else type the IP address of the server
device="localhost:1"
frequency=480
# slave mode on/off
slave=0
drop_frames=1
scale=0.001
# If slave = 1, any sensor declarations below will be ignored
# Make a rigid body for the calibration wand
# You may look into the wand.rb file for the coordinates of the LEDs relative to the local pivot
sensor=0 type=rigid_body tracker=0
sensor=1 type=point tracker=0 led=4 x=0 y=920 z=-7.615
sensor=2 type=point tracker=0 led=5 x=-7.615 y=795 z=0
sensor=3 type=point tracker=0 led=6 x=0 y=670 z=-7.615
sensor=4 type=point tracker=0 led=7 x=-7.615 y=545 z=0
sensor=5 type=point tracker=0 led=8 x=0 y=420 z=-7.615
sensor=6 type=point tracker=0 led=9 x=-7.615 y=295 z=0
sensor=7 type=point tracker=0 led=10 x=0 y=170 z=-7.615
sensor=8 type=point tracker=0 led=11 x=-7.615 y=45 z=0
# Stylus Point Tracker
sensor=9 type=stylus stylus_id=1 # map sensor 9 to stylus 1

In Vizard, load the VRPN plug-in and add a tracker for the sensor you want:
Code:
vrpn = viz.add('vrpn7.dle')
wandTracker = vrpn.addTracker('Tracker0@localhost', 0)  # the 0 is the sensor id

A tracker for any other sensor id is added the same way, e.g.:
Code:
rigidTracker = vrpn.addTracker('Tracker0@localhost', 5)
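To actually drive a model with the streamed data, you can link the tracker to a node. A minimal sketch (white_ball.wrl is just a placeholder model, and the commented-out axis swap is only an assumption in case the coordinate axes arrive flipped):
Code:
import viz
viz.go()

vrpn = viz.add('vrpn7.dle')
rigidTracker = vrpn.addTracker('Tracker0@localhost', 5)

# Drive a placeholder model with the rigid body's pose
model = viz.add('white_ball.wrl')
link = viz.link(rigidTracker, model)

# If the axes arrive flipped relative to Vizard's coordinate system,
# a swap operator on the link can reorder or negate them, e.g.:
# link.swapPos([1, 3, 2])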
#4
Hi Vaquero,
I'm also a PhaseSpace and Vizard user, and I'm happy to chat sometime if you'd like. I haven't seen the scaling issues you describe, but you're right that PhaseSpace's system works in mm and Vizard's in meters. I have a script I'm willing to share that you may find helpful; it doesn't go through VRPN, but through PhaseSpace's own Python scripts. http://cis.rit.edu/performlab/contact - gD