#1
Calibrating DK2 in Vizard
Hi all,
I'm using an Oculus Rift DK2 with runtime 0.8 and Vizard 5.3. In the Oculus Home platform, headset movements are aligned with the player's movements, meaning that right is right, up is up, and so on. But when I make these movements after running a Vizard script on a model I built in SketchUp, using vizconnect to connect the headset movement to the environment (an Xbox controller is also connected to the environment), the movements aren't translated directly. Sometimes participants need to move their head down in order to look up, and so on. Can you help me resolve this issue? Thanks in advance.
#2
Hi all,
I also tried the "mountain offset" in the vizconnect tool, which works in the piazza model attached to vizconnect but doesn't work properly in my environment. Is there another way to calibrate it so that movement in the glasses is in direct alignment with the environment? Thanks.
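The calibration idea in question can be sketched outside Vizard. This is a hypothetical illustration, not the vizconnect API: capture the headset yaw at a calibration moment, then report all later yaw readings relative to that pose so the calibration direction becomes "forward".

```python
# Hypothetical sketch: re-zeroing headset yaw against a calibration pose.
# These function names are illustrative; inside Vizard the corrected yaw
# would be applied via a tracker or viewpoint offset, not computed by hand.

def capture_yaw_offset(raw_yaw_deg):
    """Record the headset yaw (degrees) at the calibration moment."""
    return raw_yaw_deg

def corrected_yaw(raw_yaw_deg, offset_deg):
    """Yaw relative to the calibration pose, wrapped to [-180, 180)."""
    return (raw_yaw_deg - offset_deg + 180.0) % 360.0 - 180.0

offset = capture_yaw_offset(37.5)   # participant looks straight ahead
print(corrected_yaw(37.5, offset))  # 0.0: calibration direction is "forward"
print(corrected_yaw(127.5, offset)) # 90.0: a right turn reads as a right turn
```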
#3
Are you using the Oculus by itself or in combination with another tracking system?
#4
Hi Jeff,
I'm using both the Oculus and an Xbox controller for tracking. Is there another way to calibrate when both are used for tracking? Thanks.
#5
Participants are using both the Oculus and the Xbox controller to change orientation.
#6
I want to correct myself: I am using Oculus Home.
#7
Hi Jeff,
Sorry for all the comments. I rechecked my file in vizconnect with the piazza model and noticed that head movements are translated directly only at one point (the calibration point / "mountain offset"), but when I turn with the Xbox left stick the head movements are not translated directly; in fact, the movements are weird. If I could program in the software, or define in vizconnect, that headset movements must be translated in the same manner as at my calibration/fixation/"mountain offset" point, I guess that would solve my problem. Thanks.
#8
Try using the VizMove Seated VR preset configuration in vizconnect. It adds the Oculus for head tracking and a transport controlled by Xbox input signals. In the settings of the main transport there is an option called 'orientation tracker', which transport movements are made relative to. By default the orientation tracker is set to the Oculus head tracker. It sounds like this may work the way you want without any modifications or offsets.
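What "movements relative to an orientation tracker" means numerically can be shown with a small, purely illustrative sketch (not vizconnect code): with the head tracker selected, a "move forward" signal is steered by the current head yaw.

```python
import math

# Illustrative only: the direction a "forward" transport signal maps to
# when movement is made relative to an orientation tracker. Yaw is taken
# in degrees, with 0 = facing +Z and positive yaw = turning right, the
# convention Vizard uses; the function itself is our own, not an API call.

def forward_vector(yaw_deg):
    """Unit direction in the XZ ground plane that 'forward' maps to."""
    rad = math.radians(yaw_deg)
    return (math.sin(rad), math.cos(rad))  # (x, z)

print(forward_vector(0.0))   # facing +Z: forward is straight ahead
print(forward_vector(90.0))  # head turned right: forward is now along +X
```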
#9
Hi Jeff,
It seems to work the way you suggested, but now I have a different problem: in order to move forward, backward, left, and right I must use the keyboard, but I need the Xbox controller buttons (Y, A, X, B) to do so, since participants need to navigate and can't see the keyboard. If I try to change the mapping so that the controller buttons are used for walking, nothing moves and it still refers to the keyboard as the way to walk. I also want to separate walking (controller buttons) from orientation (turning with the left stick), because I want participants to just turn around (without the possibility of walking) when they reach a target zone, in order to learn that place and return to it in the next trial. Thanks.
#10
Once you apply the VizMove SeatedVR option in the latest Vizard, the transport is configured for the Xbox controller. I think the issue is that you are running Vizard 5.3, where the preset configuration is different. To use the Xbox controller with the transport, you must first add the Xbox controller in the Inputs tab. Then you'll be able to map Xbox input signals to the transport movements. Another option is to update to the latest Vizard. In that case you'll either have to install the latest Oculus software or continue using runtime 0.8 and import the Oculus module as in the line below:
Code:
import oculus_08 as oculus
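The button-to-movement mapping being asked about can be sketched as a plain lookup table. This is a hypothetical illustration of what the vizconnect Mappings tab produces; none of these names are the actual vizconnect API.

```python
# Hypothetical sketch of mapping Xbox face buttons to transport movements,
# as the vizconnect Mappings tab would wire them. The names here are
# illustrative placeholders, not real vizconnect identifiers.

BUTTON_TO_MOVE = {
    'Y': 'moveForward',
    'A': 'moveBackward',
    'X': 'moveLeft',
    'B': 'moveRight',
}

def movements_for(pressed_buttons):
    """Translate the currently pressed Xbox buttons into transport movements."""
    return [BUTTON_TO_MOVE[b] for b in pressed_buttons if b in BUTTON_TO_MOVE]

print(movements_for(['Y']))       # ['moveForward']
print(movements_for(['X', 'B']))  # ['moveLeft', 'moveRight']
```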
#11
Hi Jeff,
I upgraded to the latest version of Vizard (5.7) but I still have the same problem. I can't change the default setting since I receive an error:
Code:
Traceback (most recent call last):
  File "C:\Program Files\WorldViz\Vizard5\python\vizact.py", line 3013, in __onevent
    ret = e.call(val[1])
  File "C:\Program Files\WorldViz\Vizard5\python\vizact.py", line 2948, in _callStatic
    return func(arg,*args,**kwargs)
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\interface.py", line 1286, in <lambda>
    onMessage('Vizconnect.Node.commitChanges', lambda e: self.commitChanges(e.data.classification))
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\interface.py", line 491, in commitChanges
    self._getEditor(classification).commit()
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\edit.py", line 287, in commit
    self._removeLive()
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\edit.py", line 809, in _removeLive
    self._updateNodeCode(name)
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\edit.py", line 1115, in _updateNodeCode
    code = self._parser.generateNode(name, data)
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\code.py", line 1725, in generateNode
    code += _trimdentN(self._generateNodeMappings(data, target='raw'), 1)
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\code.py", line 1172, in _generateNodeMappings
    code += self._generateNodeMappingsPerFrame(data, target)
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\code.py", line 1228, in _generateNodeMappingsPerFrame
    mapping.generateMagCode()
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\code.py", line 269, in generateMagCode
    signalMagCode = templateObject.getMagnitudeCode(signal['inputName'], signal['signalName'])
  File "C:\Program Files\WorldViz\Vizard5\python\vizconnect\template\input.py", line 108, in getMagnitudeCode
    raise ValueError()
ValueError
I can change the input signal, but walking right and left (not turning right or left) is still tied to the keyboard, since it stays as it was programmed the first time (as your company set it up in the preset mode).
#12
I can check with a developer about these errors. Can you attach the vizconnect file you are testing with?
|
#13
Hi Jeff,
I changed the input signal directly in the script that vizconnect created, so now it works with no errors. I have a different issue to deal with right now: I want to disable movement for some of the buttons/sticks connected to a transport, but not for all of the transport's buttons/sticks, when a certain condition holds. How can I do that? Thanks.
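Selectively freezing transport signals can be sketched as a gate that zeroes only the walking signals while a condition holds. This is a hypothetical illustration (the names are not vizconnect API); in a vizconnect-generated script the same idea amounts to wrapping the signal reads for the movements to be frozen.

```python
# Illustrative sketch: gate only some transport signals on a condition.
# Signal names here are placeholders, not real vizconnect identifiers.

WALK_SIGNALS = {'moveForward', 'moveBackward', 'moveLeft', 'moveRight'}

def gated_signal(name, value, in_target_zone):
    """Zero walking signals while inside a target zone; leave turning alone."""
    if in_target_zone and name in WALK_SIGNALS:
        return 0.0
    return value

print(gated_signal('moveForward', 1.0, in_target_zone=True))  # 0.0 (frozen)
print(gated_signal('turnRight', 0.8, in_target_zone=True))    # 0.8 (allowed)
```

Checking the gate once per frame against the participant's position would let turning stay enabled in the target zone while walking is locked out.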
#14
Hi Jeff,
I have some questions:
1. Is it possible to use the "VizMove SeatedVR" preset in vizconnect with the Oculus Rift CV1? I guess it will work with the CV1 if it works with the DK2, but I just want to make sure.
2. Is it possible to add another positional tracker (a second CV1 positional tracking camera) to the "VizMove SeatedVR" preset in order to get a more accurate location?
3. Does it really make a difference to use two positional trackers (two cameras)? As far as I know, two positional trackers give a more accurate position, which causes less drifting of yaw, pitch, and roll. On the other hand, in your seated preset you combine the integrated position of the Oculus with the position from the input signal (in my case the Xbox controller) to shift the view in direct alignment with one's head orientation in the goggles. Is there an advantage to two positional trackers in the preset mode? Thanks in advance.
#15
1) Yes, the preset should work with the CV1.
2) Vizard/vizconnect gets the tracking data through the Oculus software. Whether one or more cameras are connected in the Oculus software, the setup in vizconnect is the same.
3) I'm not sure how added cameras affect the quality of the tracking other than increasing the tracking space. This question is better researched on the Oculus forums.