
vizconnect and hand gestures


shashkes
06-24-2016, 03:03 AM
I'm using vizconnect and Optotrak sensors worn on the subjects' wrists to create an avatar. I want the avatar's hands to be locked in a position where the palms of the hands face forward (sort of like a zombie). This isn't one of the built-in gestures, so I was wondering what the best way to do this is.

I'm doing this for an experiment, and we noticed that when subjects have their hands in other positions, like flat facing forward or pointing, they use an extra degree of freedom, rotating the wrist from side to side, to complete the task, which I can't measure.

Thanks!

Jeff
06-27-2016, 04:06 AM
Do you want to apply a fixed rotation to the hand relative to the arm and ignore some of the sensor data that would rotate the hand?

shashkes
07-12-2016, 10:30 AM
Yes, I want a fixed rotation with the palms facing forward; I'm not using the sensors for rotation.
I want the subjects to mirror this avatar's hands:
http://activearts.org/trees/wp-content/uploads/2016/07/avatar.png
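A minimal sketch of that fixed-rotation approach in Vizard, assuming a vizconnect configuration file named vizconnect_config.py and the default complete-character bone names 'Bip01 L Hand' / 'Bip01 R Hand'; the file name, bone names, and Euler angles below are placeholders to adjust for your own setup.

import viz
import vizconnect

# Start the vizconnect configuration (hypothetical file name).
vizconnect.go('vizconnect_config.py')

# Get the raw Vizard avatar object behind the vizconnect avatar wrapper.
avatar = vizconnect.getAvatar().getRaw()

# Lock each hand bone so animation/sensor data no longer rotates it,
# then apply a fixed rotation relative to the forearm.
for name in ['Bip01 L Hand', 'Bip01 R Hand']:
    bone = avatar.getBone(name)
    bone.lock()
    # Placeholder Euler angles; tune them until the palm faces forward on your rig.
    bone.setEuler([0, 0, -90], viz.AVATAR_LOCAL)

The idea is that bone.lock() takes manual control of just the hand bones, so the arms can keep following the wrist sensors while the hand pose stays fixed.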