#1
vizconnect and hand gestures
I'm using vizconnect and Optotrak sensors worn on the subjects' wrists to create an avatar. I want the avatar's hands to be locked in a position where the palms are facing forward (sort of like a zombie's). This isn't one of the built-in gestures, so I was wondering what the best way to do this is.
I'm doing this for an experiment: we noticed that when subjects hold their hands in other positions, such as flat forward or pointing, they use an extra degree of freedom, rotating the wrist from side to side, to complete the task, which I can't measure. Thanks!
#2
Do you want to apply a fixed rotation to the hand relative to the arm and ignore some of the sensor data that would rotate the hand?
#3
Yes, I want a fixed rotation with the palms facing forward; I'm not using the sensors for hand rotation.
I want the subjects to mirror this avatar's hands:
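One way to do this in Vizard is to lock the hand bones so that tracking and animation no longer drive them, and then set a fixed palm-forward orientation. Below is a minimal sketch, assuming a standard Vizard avatar with Biped-style bone names; the bone names, the avatar file, and the exact Euler angles are assumptions you will need to adjust for your own model.

```python
import viz

viz.go()

# Assumed avatar model; replace with the avatar from your vizconnect setup.
avatar = viz.addAvatar('vcc_male2.cfg')

# Bone names depend on the model's skeleton; the roll angles here are
# placeholders to be tuned until the palms face forward.
for name, roll in [('Bip01 L Hand', 90), ('Bip01 R Hand', -90)]:
    bone = avatar.getBone(name)
    bone.lock()                  # stop animations/trackers from driving this bone
    bone.setEuler([0, 0, roll])  # fixed palm-forward orientation
```

Since you aren't using the wrist sensors for rotation anyway, locking the hand bones this way shouldn't discard any data you care about; the arms still follow the sensors while the hands keep the fixed pose.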