#1
Detect Hands of Avatar (set them as target)
Hi
I use the Oculus Rift with a vizconnect configuration to which I've added the "Mark" avatar's hands. I want to touch objects and have them change state (e.g. color) when touched. My objects have sensors, and I can already trigger the state change when the viewpoint is near the objects. Now I want to extend the target from the viewpoint to the avatar's hands, so that the objects change state when the hands come near them. I tried something like this: Code:
viewTransform = vizconnect.getDisplay().getNode3d()
#viewTransform = vizconnect.getAvatar().getNode3d()
target = vizproximity.Target(viewTransform)
manager.addTarget(target)
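For reference, here is a minimal end-to-end sketch of the viewpoint-based setup described above. The config file name, the beachball model, and the sensor scale are placeholders, not from the original post: Code:
import viz
import vizconnect
import vizproximity

# Load the vizconnect configuration (placeholder file name).
vizconnect.go('vizconnect_config.py')

manager = vizproximity.Manager()
manager.setDebug(viz.ON)

# Example object that should change color when approached.
ball = viz.addChild('beachball.osgb')
ball.setPosition([0, 1.5, 2])
sensor = vizproximity.addBoundingSphereSensor(ball, scale=2)
manager.addSensor(sensor)

# Viewpoint-based target, as in the snippet above.
viewTransform = vizconnect.getDisplay().getNode3d()
manager.addTarget(vizproximity.Target(viewTransform))

# Change the object's state when the target enters/exits the sensor.
def enterSensor(e):
    ball.color(viz.RED)

def exitSensor(e):
    ball.color(viz.WHITE)

manager.onEnter(sensor, enterSensor)
manager.onExit(sensor, exitSensor)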
#2
This may be difficult using the Mark avatar's hands. Since the hand is only a mesh and not a distinct 3D node, it's not possible to get a handle directly to it and set it as a proximity target. You could try using the head and hands avatar instead. Attached is an example.
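For anyone reading later without the attachment, here is a hedged sketch of that suggestion. It assumes the vizconnect configuration uses a head-and-hands style avatar whose hand attachment points are named 'r_hand' and 'l_hand' (the actual names depend on the configuration), and it repeats the sensor setup from post #1 with placeholder names: Code:
import viz
import vizconnect
import vizproximity

# Assumes a configuration that uses a head-and-hands style avatar
# (placeholder file name).
vizconnect.go('vizconnect_head_and_hands.py')

manager = vizproximity.Manager()
manager.setDebug(viz.ON)

# Example object with a sensor, as in post #1.
ball = viz.addChild('beachball.osgb')
ball.setPosition([0, 1.5, 2])
sensor = vizproximity.addBoundingSphereSensor(ball, scale=2)
manager.addSensor(sensor)

# Register each hand's 3d node as a proximity target. The attachment point
# names below are assumptions; check the configuration for the exact names.
avatar = vizconnect.getAvatar()
for name in ['r_hand', 'l_hand']:
    handNode = avatar.getAttachmentPoint(name).getNode3d()
    manager.addTarget(vizproximity.Target(handNode))

# State change when a hand comes near the object.
def enterSensor(e):
    ball.color(viz.GREEN)

manager.onEnter(sensor, enterSensor)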
Tags: avatar, sensor, sensor proximity, vizconnect