Proximity Sensing Using Kinect


JamesCakes
03-31-2015, 10:54 AM
Hello,

I am using a Kinect with the FAAST program to create a 3D spherical representation of a person. My goal is for that person's hand to interact with an object in the scene, and to be able to move that object from point A to point B. Below is the code I have written so far. I am having trouble getting the sensors to connect with the hand spheres, and I also don't know how to create a movable object in Vizard. I am new to this and am using it as part of an experiment:

import viz
import vizshape
import math
import random
import time
import vizact
import vizproximity

viz.go()
grid = vizshape.addGrid()

"""
Kinect Tracker object ID's
These are not actually being used in the script but are to
help anyone who wants to get access to a specific bodypart.
For example to just get a handle to tracking data for the head use:
myHead = vrpn.addTracker( 'Tracker0@localhost', HEAD).
"""
HEAD = 0
NECK = 1
TORSO = 2
WAIST = 3
LEFTCOLLAR = 4
LEFTSHOULDER = 5
LEFTELBOW = 6
LEFTWRIST = 7
LEFTHAND = 8
LEFTFINGERTIP = 9
RIGHTCOLLAR = 10
RIGHTSHOULDER = 11
RIGHTELBOW = 12
RIGHTWRIST = 13
RIGHTHAND = 14
RIGHTFINGERTIP = 15
LEFTHIP = 16
LEFTKNEE = 17
LEFTANKLE = 18
LEFTFOOT = 19
RIGHTHIP = 20
RIGHTKNEE = 21
RIGHTANKLE = 22
RIGHTFOOT = 23

#store trackers, links, and vizshape objects
trackers = []
links = []
shapes = []

#start vrpn
vrpn = viz.addExtension('vrpn7.dle')

#now add all trackers and link a shape to each one
for i in range(24):
    t = vrpn.addTracker('Tracker0@localhost', i)
    s = vizshape.addSphere(radius=0.1)
    l = viz.link(t, s)
    trackers.append(t)
    links.append(l)
    shapes.append(s)

################################################################
### Proximity Section ###

viz.phys.enable()

manager = vizproximity.Manager()
manager.setDebug(viz.ON)

#Add main viewpoint as proximity target
target = vizproximity.Target(viz.MainView)
manager.addTarget(target)

#Creating Right Hand sensor sphere (tracker index 14 = RIGHTHAND)
rightHandSphere = vizshape.addSphere(radius = 0.08)
rightHandSphere.color( viz.BLUE )
rightHandLink = viz.link( shapes[RIGHTHAND] , rightHandSphere ) #links the tracked right-hand shape to the sphere

#Creating Left Hand sensor sphere (tracker index 8 = LEFTHAND)
leftHandSphere = vizshape.addSphere(radius = 0.08)
leftHandSphere.color( viz.BLUE )
leftHandLink = viz.link( shapes[LEFTHAND] , leftHandSphere ) #links the tracked left-hand shape to the sphere
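
For reference, a minimal sketch of one way the missing pieces could be wired up with vizproximity (the box object, sensor radius, target positions, and callback name below are illustrative assumptions, not part of the original script):

#Movable object that the hands can interact with (illustrative object)
box = vizshape.addBox(size=[0.3, 0.3, 0.3])
box.setPosition([0, 1, 2])

#Wrap the box in a spherical proximity sensor
boxSensor = vizproximity.Sensor(vizproximity.Sphere(0.3), source=box)
manager.addSensor(boxSensor)

#Register both hand spheres as proximity targets
manager.addTarget(vizproximity.Target(rightHandSphere))
manager.addTarget(vizproximity.Target(leftHandSphere))

#When a hand enters the sensor, move the box from point A to point B
def onEnterBox(e):
    box.runAction(vizact.moveTo([1, 1, 2], time=1.0))
manager.onEnter(boxSensor, onEnterBox)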

Jeff
04-01-2015, 11:58 PM
You could use vizconnect to add your trackers and link a grabber tool to the hand. The grabber tool makes it possible to grab and move objects in the scene. It would be best to go through all the vizconnect tutorials, as each one builds on the previous. The last one, avatars and tools (http://docs.worldviz.com/vizard/#vizconnect_tutorial_tool_avatar.htm%3FTocPath%3DTutorials%20%26%20Examples%7CVizconnect%7CAvatars%20and%20Tools%7C_____1), covers this topic.
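
A rough sketch of what that could look like in a script, assuming the vizconnect file is saved as 'vizconnect_config.py' and the grabber tool was named 'grabber' in the Tools tab (both names are assumptions, not confirmed in this thread):

import viz
import vizshape
import vizconnect

# Load the vizconnect configuration created in the GUI (file name is an assumption)
vizconnect.go('vizconnect_config.py')

# Retrieve the grabber tool by the name it was given in the Tools tab (name is an assumption)
grabber = vizconnect.getRawTool('grabber')

# An object to pick up and move from point A to point B
ball = vizshape.addSphere(radius=0.1)
ball.setPosition([0, 1.5, 1])

# Restrict the grabber to this object so it can be grabbed and moved by the tracked hand
grabber.setItems([ball])

The grabber then handles picking up and releasing whatever is passed to setItems, driven by the grab mapping configured in the vizconnect GUI.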

JamesCakes
04-14-2015, 10:16 AM
Hello,

I was able to use vizconnect to create the head tracker that tracks the user's movement with the Xbox Kinect. I added right and left arm trackers, but I am having trouble getting them to appear on the screen so that I can see them. I was also trying to add a grabber, but I don't know how to create gestures that vizconnect would recognize and let me use.

Jeff
04-15-2015, 01:56 PM
What kind of avatar did you add in the Avatars tab of vizconnect? Did you assign the hand trackers to the avatar in the Animator section?

rajnishv
06-22-2016, 11:42 PM
Hi Jeff,
I am running into the same issue trying to get the Kinect for Xbox 360 working with Vizard.
I am using a single Kinect as the tracker for both the head and the hand.
I have a vizconnect file with 2 trackers, one for the head and one for the hand, both tracked by the single Kinect camera.
I have also assigned those trackers in the avatar Animator section of vizconnect.
- For the head tracker I am using sensor ID '0'
- For the hand tracker (right hand) I am using sensor ID '14'

I am getting data for both trackers in the vizconnect GUI status dialog.

The head tracker works and is oriented properly, but I cannot get the hand tracker's grabber to show up in the screen projection, and the hand's orientation is not visible either.

Please find attached screenshots of the vizconnect file and the vizconnect GUI for reference.

Thanks & Regards,

Mr. Rajnish Vishwakarma
VR Developer at Xenium Digital Pvt Ltd (customer of WorldViz), Mumbai

Waiting for your reply, Jeff!

Jeff
06-23-2016, 03:31 AM
The y values of both trackers look low. The hand tracker shows a negative y value. For each tracker in vizconnect, press the Offsets button and add a 'post trans y' offset to account for the Kinect sensor's height off the ground.
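
If the trackers are added in a script rather than through the vizconnect GUI, roughly the same correction can be applied on the link, as a sketch (the 0.8 m sensor height and tracker index are illustrative values only):

import viz
import vizshape

viz.go()

# Assumes a vrpn tracker and linked sphere, as in the earlier script
vrpn = viz.addExtension('vrpn7.dle')
handTracker = vrpn.addTracker('Tracker0@localhost', 14)
handSphere = vizshape.addSphere(radius=0.08)
handLink = viz.link(handTracker, handSphere)

# Raise the tracked data by the Kinect's height off the ground (illustrative value)
KINECT_HEIGHT = 0.8
handLink.postTrans([0, KINECT_HEIGHT, 0])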

rajnishv
06-24-2016, 05:52 AM
Thanks for the information, Jeff!