Use of viztouch for Right/Wrong Responses


Vishav
09-24-2018, 03:11 AM
Hi,

I am new to using a touch monitor. I want to use it in a Vizard-based application in which the user has to use the touchscreen to make a choice among multiple available options.

Once an option is chosen by the user (with a finger touch), the interface will record it and move to the next task.

How can this be done in Vizard code?

Jeff
09-25-2018, 04:39 AM
The viztouch module is undocumented but may work for your application. I'm not sure that it's compatible with the latest versions of Windows. Run the viztouch.py module included with Vizard and that will start up a demo.

Vishav
09-25-2018, 04:55 AM
Yes, I ran viztouch.py and it works. But how can it be used for a touch-based response application?

Vishav
09-25-2018, 05:08 AM
The viztouch.py demo shows drawing a line with the touch monitor. My requirement, however, is to show three objects to the user, who will select one. Once the user touches an object, the system should recognize it, just as it does a mouse click, and give feedback.

Jeff
09-25-2018, 05:12 AM
The viztouch module includes the following events:

TOUCH_DOWN_EVENT
TOUCH_UP_EVENT
TOUCH_MOVE_EVENT
GESTURE_EVENT

You can set up the program flow using viztask and wait for touch events using the waitEvent (https://docs.worldviz.com/vizard/latest/#commands/viztask/waitEvent.htm) command. Once a touch event happens, you'll have to determine the region (e.g. Yes button, No button) based on the touch point.
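The region check itself is plain geometry and can be sketched independently of Vizard. Below is a minimal, hypothetical helper: the region names and pixel rectangles are made-up placeholders, and in a real script you would fill them in with your buttons' actual window-pixel bounds and call the helper on the position taken from the touch event data.

```python
# Each region is (name, (x_min, y_min, x_max, y_max)) in window pixels.
# These names and rectangles are placeholders, not part of viztouch.
REGIONS = [
    ('yes_button', (100, 100, 300, 200)),
    ('no_button',  (400, 100, 600, 200)),
]

def region_at(x, y, regions=REGIONS):
    """Return the name of the region containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in regions:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

In the viztask flow you would wait for a touch event, read its pixel position, and branch on the value `region_at` returns (e.g. record "Yes" vs. "No" and continue to the next task).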

Vishav
09-26-2018, 01:17 AM
Your suggestions helped with the problem, but now I am stuck on:
(i) finding the position of a touch event using viztouch, and
(ii) separating touch events on my objects from touches elsewhere on the screen.

Jeff
09-26-2018, 05:15 AM
What version of Windows OS are you running?

When the touch event is triggered it will return an event object with the following:

point: a TouchPoint object which contains time and position data
pos: the pixel position

If you're waiting for the event using viztask, the data object returned will contain the event data. See the example code in the waitEvent page to see how to access that information.

You'll need to know the pixel boundaries of your touch objects and then compare the position data against that.
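To show the comparison on its own, here is a sketch that uses a tiny stand-in class in place of the real event data (which would come from `d = yield viztask.waitEvent(viztouch.TOUCH_UP_EVENT)`). The attribute name `pos` follows the list above; the class, function, and rectangle values are otherwise made up for illustration.

```python
class EventData:
    """Stand-in for the touch event data; only `pos` is modeled here."""
    def __init__(self, pos):
        self.pos = pos  # pixel position of the touch

def feedback_for(d, target_rect):
    """Return 'correct' if the touch landed inside target_rect, else 'wrong'.

    target_rect is (x_min, y_min, x_max, y_max) in window pixels.
    """
    x, y = d.pos
    x0, y0, x1, y1 = target_rect
    return 'correct' if (x0 <= x <= x1 and y0 <= y <= y1) else 'wrong'
```

With the real event data in place of `EventData`, you would call something like `feedback_for(d, bowl_rect)` after the `waitEvent` line to decide which feedback to show.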

Vishav
09-26-2018, 06:25 AM
I am using Windows 7

I am attaching code below:
import viz
import viztask
import viztouch

# Raw strings (r'...') keep the Windows backslashes from being read as escapes.
ob5 = viz.add(r'D:\PhD\Study-I_Part_I\My_Objects\Bowl\Bowl.dae', pos=[1.25,1.15,-0.26], euler=[0,0,0], scale=(0.1,0.1,0.1))
ob6 = viz.add(r'D:\PhD\Study-I_Part_I\My_Objects\Teapot\Teapot.dae', pos=[1.3,0.935,0.93], euler=[90,0,0], scale=(0.09,0.08,0.09))
d = yield viztask.waitEvent(viztouch.TOUCH_UP_EVENT)
##### what next ###########

Vishav
09-26-2018, 10:59 PM
As you said, when an event is triggered it returns an event object; in my case it returns:
(<viz.Event object at 0x00000000045386D8>,)
How do I interpret this?

I am attaching a screenshot as well.