#1
In your code, two coordinate swaps are applied to the position data: one to the head tracker and one to the view link. Do all virtual movements (left, right, up, down) match the physical ones? If so, then it's probably fine, but it should only be necessary to apply the coordinate swap once.
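To illustrate why the swap should only be applied once, here is a minimal sketch. The specific mapping below, (x, y, z) → (x, z, -y) for converting Z-up tracker data to Y-up, is a hypothetical example and not necessarily the Phasespace-to-Vizard mapping; the point is that applying the same swap twice does not cancel out, it produces a third, incorrect orientation of the axes:

```python
def swap(p):
    # Hypothetical Z-up -> Y-up coordinate swap: (x, y, z) -> (x, z, -y)
    x, y, z = p
    return (x, z, -y)

raw = (1.0, 2.0, 3.0)    # sample tracker reading
once = swap(raw)         # (1.0, 3.0, -2.0)  -- converted position
twice = swap(once)       # (1.0, -2.0, -3.0) -- double-swapped: wrong axes
print(once, twice)
```

So a second swap on the view link would silently re-scramble data that the head-tracker swap had already fixed.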
You may need to account for a unit conversion between Phasespace and Vizard, but applying an additional scale factor should not be necessary. I would recommend testing with a simple script: add an axes model linked to the head tracker and print out the data every frame. Verify that this works properly on its own before incorporating the tracking data into your full script. Is the Phasespace origin calibrated to the ground plane? To account for the offset between the user's eyes and the position tracker's mounting point, apply a preTrans operation to the view link. The values you are using suggest the position tracker is mounted in front of the user's eyes. Is that the case?

Code:
viewLink.preTrans([0,-0.08,-0.05])
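For intuition about what preTrans does: the offset is applied in the link's local frame, so it rotates with the head rather than staying fixed in the world. A simplified illustration, using a generic right-handed rotation about the Y (up) axis rather than Vizard's internal math:

```python
import math

def apply_pre_trans(yaw_deg, offset):
    # Rotate a local-frame offset by the head's yaw angle (rotation about
    # the Y axis, right-handed convention) to get the world-space offset.
    a = math.radians(yaw_deg)
    x, y, z = offset
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

offset = (0, -0.08, -0.05)          # eyes 8 cm below, 5 cm behind the tracker
print(apply_pre_trans(0, offset))   # facing forward: offset unchanged
print(apply_pre_trans(90, offset))  # after a 90-degree head turn, the offset
                                    # has rotated along with the head
```

If the offset were instead added after the rotation (a postTrans), it would not track head turns, which is why the eye offset belongs in a preTrans.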
#2
Thanks for pointing out the double swap, Jeff! I had overlooked that. The positional and quaternion swaps are now all set up correctly.
It turns out that most of the issues I see are related to the FOV. The culprits are these lines:

Code:
import nvis
nvis.nvisorST()

Setting the FOV manually makes things much better, but not perfect. There's still quite a lot of rotation around the Y-axis when turning the head, though much less up-and-down movement. It does fix the scale of the perceived world, so scale no longer seems to be an issue. The ST50 has a vertical FOV of 32°, as I mentioned, but a horizontal FOV of 40°; the diagonal (binocular) FOV is 50°. So I changed some lines to this:

Code:
viz.fov(32, 1.25)
viz.window.setFullscreenMonitor([4,3])
viz.go(viz.FULLSCREEN | viz.STEREO_HORZ | viz.HMD)

I also turned up the sensitivity of the InertiaCube4 in hopes of reducing latency. Are there recommended settings for use in VR with Vizard? The LED that tracks the HMD is 5 cm in front of the eyes and 8 cm above them (on top of the HMD). Would it be better to put it somewhere else? Or should I put two LEDs, one on each side of the inertia cube, and see if I can create the optical heading I read about in vizconnect?

Last edited by Vaquero; 04-12-2016 at 08:18 AM.
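A side note on the aspect argument: viz.fov(32, 1.25) uses the simple angle ratio 40/32 = 1.25. Because FOV angles do not scale linearly with screen extent, an aspect computed from the tangents of the half-angles comes out slightly different. A quick check, assuming aspect here means the ratio of horizontal to vertical frustum extents (worth verifying against what the renderer expects):

```python
import math

h_fov, v_fov = 40.0, 32.0  # ST50 horizontal / vertical FOV in degrees

# Naive ratio of the angles themselves:
simple_aspect = h_fov / v_fov
# Ratio of the frustum half-extents at unit distance:
tan_aspect = (math.tan(math.radians(h_fov / 2))
              / math.tan(math.radians(v_fov / 2)))

print(round(simple_aspect, 3))  # 1.25
print(round(tan_aspect, 3))     # 1.269
```

The ~1.5% difference is small at these angles, but it is one more parameter that could contribute to residual world motion when the rendered FOV doesn't exactly match the display's physical FOV.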
#3
The actual setting that reduced some of the movement of the world is this:
Code:
isense = viz.add('intersense.dle')
headOriTracker = isense.addTracker()
headOriTracker.setPrediction(50)
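For intuition, setPrediction(50) asks the tracker to extrapolate the orientation roughly 50 ms into the future, so the rendered pose matches where the head will be once the frame reaches the display. A toy illustration of the idea, using simple linear extrapolation from angular velocity (not InterSense's actual filtering):

```python
def predict(angle_deg, angular_velocity_dps, prediction_ms):
    # Linearly extrapolate an orientation angle forward in time to
    # compensate for tracker-to-display latency.
    return angle_deg + angular_velocity_dps * (prediction_ms / 1000.0)

# Head turning at 100 deg/s, currently at 30 deg yaw; with 50 ms of
# prediction we render the pose expected 50 ms from now:
print(round(predict(30.0, 100.0, 50), 6))  # 35.0
```

This is why prediction reduces the apparent "swimming" of the world during head turns: without it, every rendered frame shows where the head was one latency interval ago.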
Tags |
augmented reality, drift, fov, intersense, tracking |