#1
Proximity Sensing Using Kinect
Hello,
I am using a Kinect with the FAAST program to create a 3D spherical representation of a person. My goal is to let that person's hand interact with an object that has been created, so that the object can be moved from point A to point B. I am having trouble getting the sensors to connect with the hand spheres, and I also don't know how to create a movable object in Vizard. I am new to this and am using it as part of an experiment. Below is the code I have written so far:

import viz
import vizshape
import math
import random
import time
import vizact
import vizproximity

viz.go()

grid = vizshape.addGrid()

"""
Kinect tracker object IDs.
These are not actually used in the script but are here to help anyone
who wants access to a specific body part. For example, to get a handle
to the tracking data for the head use:
myHead = vrpn.addTracker( 'Tracker0@localhost', HEAD )
"""
HEAD = 0
NECK = 1
TORSO = 2
WAIST = 3
LEFTCOLLAR = 4
LEFTSHOULDER = 5
LEFTELBOW = 6
LEFTWRIST = 7
LEFTHAND = 8
LEFTFINGERTIP = 9
RIGHTCOLLAR = 10
RIGHTSHOULDER = 11
RIGHTELBOW = 12
RIGHTWRIST = 13
RIGHTHAND = 14
RIGHTFINGERTIP = 15
LEFTHIP = 16
LEFTKNEE = 17
LEFTANKLE = 18
LEFTFOOT = 19
RIGHTHIP = 20
RIGHTKNEE = 21
RIGHTANKLE = 22
RIGHTFOOT = 23

# Store trackers, links, and vizshape objects.
trackers = []
links = []
shapes = []

# Start VRPN.
vrpn = viz.addExtension('vrpn7.dle')

# Add all 24 trackers and link a sphere to each one.
for i in range(0, 24):
    t = vrpn.addTracker( 'Tracker0@localhost', i )
    s = vizshape.addSphere(radius=.1)
    l = viz.link(t, s)
    trackers.append(t)
    links.append(l)
    shapes.append(s)

### Proximity Section ###
viz.phys.enable()

manager = vizproximity.Manager()
manager.setDebug(viz.ON)

# Add the main viewpoint as a proximity target.
target = vizproximity.Target(viz.MainView)
manager.addTarget(target)

# Creating the right hand sensor sphere.
rightHandSphere = vizshape.addSphere(radius=0.08)
rightHandSphere.color( viz.BLUE )
rightHandLink = viz.link( shapes[RIGHTHAND], rightHandSphere )  # attach it to the tracked right hand

# Creating the left hand sensor sphere.
leftHandSphere = vizshape.addSphere(radius=0.08)
leftHandSphere.color( viz.BLUE )
leftHandLink = viz.link( shapes[LEFTHAND], leftHandSphere )  # attach it to the tracked left hand
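A minimal sketch of one way to connect these pieces, building on the script above (the box object, the 0.3 m sensor radius, and the grab-on-enter / spacebar-release behaviour below are illustrative assumptions, not part of the original script): make the hand spheres proximity targets, put a sphere sensor around the object to be moved, and attach the object to the hand with viz.grab when a hand enters the sensor.

```python
# Sketch only -- assumes the script above (manager, rightHandSphere, leftHandSphere) is running.

# The hand spheres become proximity targets...
rightTarget = vizproximity.Target(rightHandSphere)
leftTarget = vizproximity.Target(leftHandSphere)
manager.addTarget(rightTarget)
manager.addTarget(leftTarget)

# ...and the movable object gets a spherical sensor around it.
box = vizshape.addBox(size=[0.2, 0.2, 0.2], pos=[0, 1, 2])  # illustrative object
boxSensor = vizproximity.Sensor(vizproximity.Sphere(0.3), source=box)
manager.addSensor(boxSensor)

grabLink = None

def onEnterBox(e):
    """When a hand sphere enters the box sensor, attach the box to that hand."""
    global grabLink
    if grabLink is None:
        hand = rightHandSphere if e.target is rightTarget else leftHandSphere
        grabLink = viz.grab(hand, box)

def releaseBox():
    """Spacebar drops the box wherever it currently is."""
    global grabLink
    if grabLink is not None:
        grabLink.remove()
        grabLink = None

manager.onEnter(boxSensor, onEnterBox)
vizact.onkeydown(' ', releaseBox)
```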
#2
You could use vizconnect to add your trackers and link a grabber tool to the hand. The grabber tool makes it possible to grab and move objects in the scene. It would be best to go through all the vizconnect tutorials, as each one builds on the previous. The last one, Avatars and Tools, covers this topic.
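If it helps to see the code side, here is a minimal sketch assuming a configuration saved from vizconnect as 'vizconnect_config.py' with a grabber tool named 'grabber' already set up in the GUI (both names are placeholders for whatever you actually used):

```python
import viz
import vizshape
import vizconnect

# Load the vizconnect configuration (filename is a placeholder).
vizconnect.go('vizconnect_config.py')

# An object that can be picked up and moved.
ball = vizshape.addSphere(radius=0.1, pos=[0, 1.5, 1])
ball.color(viz.RED)

# Get the grabber tool defined in the vizconnect GUI (tool name is a placeholder)
# and tell it which scene objects are grabbable.
grabber = vizconnect.getRawTool('grabber')
grabber.setItems([ball])
```

The grab and release signals themselves are mapped to inputs in the tool's Mappings section of the vizconnect GUI, so that part usually does not need extra code.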
#3
Creating Gestures
Hello,
I was able to use vizconnect to create a head tracker that follows the user's movement with the Xbox Kinect. I added right and left arm trackers, but I am currently having trouble getting them to appear on screen where they can be seen. I was also trying to add a grabber, but I don't know how to create gestures that vizconnect would recognize and let me apply.
#4
What kind of avatar did you add in the Avatars tab of vizconnect? Did you assign the hand trackers to the avatar in the Animator section?
#5
Regarding Kinect & Vizard
Hi Jeff,
I am running into the same issues getting the Kinect for Xbox 360 to work with Vizard. I am using a single Kinect to track both the head and the hand: my vizconnect file has two trackers, one for the head and one for the hand, both fed by the same Kinect camera, and I have assigned those trackers in the Avatar Animator section of vizconnect.
- For the head tracker I am using sensor ID '0'.
- For the hand tracker (right hand) I am using sensor ID '14'.
I am receiving data for both trackers in the vizconnect GUI status dialog. The head tracker works and is oriented properly, but the hand tracker's grabber is not visible in the screen projection and the hand's orientation is not shown. Please find screenshots of the vizconnect file and the vizconnect GUI attached for reference.
Thanks & regards,
Mr. Rajnish Vishwakarma
VR Developer at Xenium Digital Pvt Ltd (Customer of World Viz), Mumbai
Waiting for your reply, Jeff!
#6
The y values of both trackers look low, and the hand tracker shows a negative y value. For each tracker in vizconnect, press the Offsets button and add a 'post trans y' offset to account for the Kinect sensor's height off the ground.
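For reference, the same kind of height correction can also be applied in code when working with raw trackers outside of vizconnect. This is only a sketch; the 1.0 m value is a placeholder for however high your Kinect is actually mounted:

```python
import viz
import vizshape

viz.go()

KINECT_HEIGHT = 1.0  # placeholder: the Kinect's height off the ground, in meters

# Raw FAAST/VRPN head tracker (sensor id 0), linked to a sphere for visualization.
vrpn = viz.addExtension('vrpn7.dle')
head = vrpn.addTracker('Tracker0@localhost', 0)
headSphere = vizshape.addSphere(radius=0.1)
headLink = viz.link(head, headSphere)

# Shift the tracked position up so the data is measured from the floor,
# equivalent to a 'post trans y' offset in vizconnect.
headLink.postTrans([0, KINECT_HEIGHT, 0])
```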
#7
Regarding Kinect..
Thanks for the information, Jeff!
Tags: kinect, proximity sensor