#1
Integrating ASL Eye Tracker
Hello. I am trying to integrate an ASL eye tracker into our current VR system so that we can identify which objects in the rendered 3D world the user is looking at. The problem lies in mapping the eye tracker's 2D output coordinates into the 3D Vizard world. I noticed the 'pick' command, which returns the object under the mouse cursor, but is there a way to drive the cursor with the eye tracker's output coordinates? Or is there an easier or more effective way to handle this problem? Thanks.
|
#2
The pick command accepts an optional position argument containing the normalized (0-1) screen coordinates at which to perform the pick. If not specified, it defaults to the current mouse position. For example, to perform the pick operation at the middle of the screen, you would use the following code:
Code:
node = viz.MainWindow.pick(pos=(0.5,0.5))
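Since the eye tracker typically reports gaze in screen pixels rather than normalized coordinates, you will need a small conversion step before calling pick. Below is a minimal sketch of such a helper. The assumptions here are mine, not from the thread: it assumes the tracker reports pixel coordinates with the origin at the top-left and y increasing downward (a common convention, but check your ASL unit's documentation), and that Vizard's normalized window coordinates have their origin at the bottom-left. The function name and parameters are hypothetical.

```python
def gaze_to_window_coords(gaze_x, gaze_y, screen_width, screen_height):
    """Convert eye tracker output in pixels (assumed top-left origin,
    y increasing downward) to normalized (0-1) window coordinates
    with a bottom-left origin, suitable for passing to pick."""
    norm_x = gaze_x / float(screen_width)
    norm_y = 1.0 - gaze_y / float(screen_height)  # flip the y-axis
    # Clamp so off-screen gaze samples don't produce out-of-range picks
    norm_x = min(max(norm_x, 0.0), 1.0)
    norm_y = min(max(norm_y, 0.0), 1.0)
    return (norm_x, norm_y)
```

You could then feed the result straight into the pick call shown above, e.g. `node = viz.MainWindow.pick(pos=gaze_to_window_coords(x, y, 1024, 768))`, where `x` and `y` come from your tracker's data stream.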