Reese1220
08-14-2011, 08:02 AM
Hello. I am trying to integrate an ASL eye tracker into our current VR system, with the goal of identifying which objects in the rendered 3D world the user is looking at. The problem lies in mapping the 2D coordinates of the eye-tracker output into the 3D Vizard world. I noticed the 'pick' command, which returns the object under the mouse cursor, but is there a way to drive the cursor (or the pick position) from the eye-tracker output coordinates? Or is there an easier/more effective way to handle this problem? Thanks.
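For what it's worth, here is a minimal sketch of the coordinate mapping step I have in mind. This is not actual ASL or Vizard API code — the function name, the tracker resolution, and the top-left-origin convention are all assumptions — it just shows converting a raw gaze sample in tracker pixel space into normalized [0, 1] window coordinates, which is the form a pick-style query would typically expect:

```python
def gaze_to_window(gaze_x, gaze_y, tracker_w, tracker_h):
    """Map tracker pixel coordinates to normalized window coordinates.

    Assumes the tracker reports with a top-left origin (y down) while the
    render window uses a bottom-left origin (y up), so y is flipped.
    """
    nx = gaze_x / tracker_w
    ny = 1.0 - gaze_y / tracker_h
    # Clamp so noisy samples near the screen edges stay inside the window.
    return min(max(nx, 0.0), 1.0), min(max(ny, 0.0), 1.0)

# Example: a gaze sample at the center of a 640x480 tracker image.
print(gaze_to_window(320, 240, 640, 480))  # -> (0.5, 0.5)
```

The idea would then be to feed the normalized coordinates into Vizard's pick/intersection query each frame instead of the mouse position — though whether the pick call accepts an explicit position argument is exactly what I'm hoping someone can confirm.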