WorldViz User Forum (https://forum.worldviz.com/index.php)
-   Plug-in development (https://forum.worldviz.com/forumdisplay.php?f=8)
-   -   HapticMaster + Vizard 4.0 SDK help (https://forum.worldviz.com/showthread.php?t=4215)

pwsnow 04-23-2012 07:07 AM

HapticMaster + Vizard 4.0 SDK help
 
Hi everyone, I’m currently assessing Vizard to decide whether our team should purchase a license. I have downloaded the demo copy, but I need to get an external (unsupported) device working with it before we commit to a purchase.

The device I am using for my work is the Moog FCS HapticMaster. Basically it’s a big robot arm that interacts with a virtual world like a mouse but provides haptic feedback. I previously created a demo app using Ogre (a graphics engine), and since that was in C++ I could easily integrate the HapticMaster, read the end effector’s xyz coordinates, and use them to control a simple model.
I understand that this device isn’t officially supported in Vizard 4.0, but I’m looking to write my own plug-in. I’m having difficulty understanding some of the instructions here: http://docs.worldviz.com/vizard/VizH...ementation.htm so I thought I would ask here for some pointers. I’ll explain below how I usually interface with the HapticMaster, to help clarify the situation.


When creating the Ogre application to use the HapticMaster end effector’s xyz position, all I did was include the HapticMaster API header files and merge parts of one of the HapticMaster examples with the Ogre app. In this case that meant sending/receiving commands to the HapticMaster (via a string command, yeah, not my choice!), splitting the string into the xyz values, and assigning those to the movement of the model. So all the code is in one .cpp file, calling the .h file and sending commands to the HapticMaster. My main problem is getting my head around doing this with the Vizard SDK.
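To illustrate, the string-splitting step boils down to something like this. Note this is my own stand-in sketch: `parseFloatVec` here is not the API’s ParseFloatVec, and the real HapticMaster response format may differ from plain whitespace-separated numbers.

```cpp
#include <sstream>
#include <string>

// Parse a whitespace-separated "x y z" response string into three doubles.
// Returns true on success. Stand-in for the HapticAPI's ParseFloatVec;
// the device's actual response format may differ.
bool parseFloatVec(const std::string& response, double& x, double& y, double& z)
{
    std::istringstream iss(response);
    return static_cast<bool>(iss >> x >> y >> z);
}
```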


I know that I have to include the HapticMaster .h/.lib files etc.
I have the C++ code to get the position of the arm and have stored the values as doubles (x, y, z), though I might have to convert them to Vizard's float type.
I know I have to make these values available to my .viz Python code.
But I can’t find any clear, decent guides on how to do this within the Vizard SDK, or on how I would use the resulting files in a .viz file.

Any help or pointers will be gladly received! Let me know if there is anything I can do to explain further.

Below is a section of the Ogre code I wrote; I hope it is of some use in pointing me in the right direction to get started! Thanks.

Code:

bool TutorialApplication::processUnbufferedInput(const Ogre::FrameEvent& evt)
{
    static Ogre::Real mMove = 150;  // Movement speed constant

    Ogre::Vector3 transVector = Ogre::Vector3::ZERO;

    // Get the current end effector position from the HapticMaster
    // (from page 94 of the Moog HapticAPI Programming Manual)
    if ( haSendCommand( dev, "get measpos", response ) ) {
        printf("--- ERROR: Could not query current position\n");
    }
    else {
        ParseFloatVec( response, CurrentPosition[PosX], CurrentPosition[PosY], CurrentPosition[PosZ] );
    }

    // Map the end effector position onto camera movement.
    // The LSHIFT checks were added for debugging only - remove when done.
    if (CurrentPosition[PosX] > 0 && mKeyboard->isKeyDown( OIS::KC_LSHIFT )) // Forward
    {
        transVector.z += mMove;
    }
    if (CurrentPosition[PosX] < 0 && mKeyboard->isKeyDown( OIS::KC_LSHIFT )) // Backward
    {
        transVector.z -= mMove;
    }
    if (CurrentPosition[PosY] < 0) // Left - yaw or strafe
    {
        if (mKeyboard->isKeyDown( OIS::KC_LSHIFT ))
        {
            transVector.x -= mMove; // Debug only - move into the else branch later
        }
    }
    if (CurrentPosition[PosY] > 0) // Right - yaw or strafe
    {
        if (mKeyboard->isKeyDown( OIS::KC_LSHIFT ))
        {
            transVector.x += mMove; // Debug only - move into the else branch later
        }
    }
    if (CurrentPosition[PosZ] > 0 && mKeyboard->isKeyDown( OIS::KC_LSHIFT )) // Up
    {
        transVector.y += mMove;
    }
    if (CurrentPosition[PosZ] < 0 && mKeyboard->isKeyDown( OIS::KC_LSHIFT )) // Down
    {
        transVector.y -= mMove;
    }

    mSceneMgr->getSceneNode("headNode")->translate(transVector * evt.timeSinceLastFrame, Ogre::Node::TS_LOCAL);

    return true;
}


farshizzo 04-23-2012 05:25 PM

It sounds like you will need to create a class that extends the "VizExtensionSensor" interface. Sensors are objects that provide position, orientation, button, and analog data. Your sensor class would just override the "getPosition" and "getOrientation" methods. You would then override the "createSensor" method of your VizExtension class to provide access to your sensor class.
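Roughly, the sensor class would take this shape. The base class below is a simplified stand-in so the example is self-contained — the real interface, method signatures, and registration come from the Vizard SDK headers, and the `update` helper is hypothetical (in a real extension the values would come from "get measpos" and the pot readings).

```cpp
#include <array>

// Simplified stand-in for the SDK's VizExtensionSensor interface; the real
// base class and signatures come from the Vizard SDK headers.
struct VizExtensionSensorStub {
    virtual ~VizExtensionSensorStub() {}
    virtual std::array<double,3> getPosition() = 0;
    virtual std::array<double,4> getOrientation() = 0; // quaternion (x,y,z,w)
};

// Hypothetical HapticMaster sensor: in the real extension, getPosition would
// query the device; here, cached values are returned for illustration.
class HapticMasterSensor : public VizExtensionSensorStub {
public:
    std::array<double,3> getPosition() override { return pos_; }
    std::array<double,4> getOrientation() override { return quat_; }
    // Stand-in for feeding in device data.
    void update(double x, double y, double z) {
        pos_[0] = x; pos_[1] = y; pos_[2] = z;
    }
private:
    std::array<double,3> pos_{{0.0, 0.0, 0.0}};
    std::array<double,4> quat_{{0.0, 0.0, 0.0, 1.0}}; // identity quaternion
};
```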

Once this is implemented, the Python code for retrieving the position of the haptic device would look something like this:
Code:

# Load HapticMaster extension
HapticMaster = viz.add('HapticMaster.dle')

# Create end effector sensor
endEffector = HapticMaster.addSensor()

# Get end effector position
pos = endEffector.getPosition()

# (Optional) Link a model to the end effector sensor
import vizshape
sphere = vizshape.addSphere(0.1)
viz.link(endEffector,sphere)

Let me know if anything is unclear.

pwsnow 04-24-2012 05:13 AM

Many thanks for the reply; your post clears up a lot!

I do have further questions though.

I've followed the guide I linked in my first post and set up the SDK in my copy of VS2008 (thankfully I had a copy!). I've added the code to create a .dle file to test whether the sample code works (which it does), although I get a pop-up asking me about an executable for the debug session. My question here is: how would I add the .h files for the HapticMaster? Would this be via "Project -> Properties", or hard-coded into "MyExtension.cpp"? I'm guessing that as long as the .cpp file finds the HapticMaster commands it doesn't matter.

I take it that overriding getPosition, getOrientation etc. would be coded in MyExtension.cpp? The getPosition part will be easy; I would use the code from my Ogre app. However, I have an ADL Gimbal attached to the HapticMaster which allows the arm to be controlled (www.youtube.com/watch?v=RvXGnFky3c) and which uses 3 pot sensors to gather additional values. I take it that the code from the HapticMaster API to get these 3 values should go into getOrientation?

Connecting to the HapticMaster to obtain these values happens over a network connection. I take it that, just like in my Ogre app, I define this connection along with any other variables at the top of MyExtension.cpp, just like any program?

I'll have a go now and try to get it working; thanks to your post it seems a lot clearer. It also means that, now I know I can get the HapticMaster working with Vizard 4.0, we can put an order in.

Many thanks, and I'm sure I will be back with more questions.

farshizzo 04-24-2012 05:21 PM

Quote:

My question here is: how would I add the .h files for the HapticMaster? Would this be via "Project -> Properties", or hard-coded into "MyExtension.cpp"? I'm guessing that as long as the .cpp file finds the HapticMaster commands it doesn't matter.
The easiest way would be to simply place the .h file in the same directory as the Visual Studio project for your extension.

Quote:

I take it that overriding getPosition, getOrientation etc. would be coded in MyExtension.cpp? The getPosition part will be easy; I would use the code from my Ogre app. However, I have an ADL Gimbal attached to the HapticMaster which allows the arm to be controlled (www.youtube.com/watch?v=RvXGnFky3c) and which uses 3 pot sensors to gather additional values. I take it that the code from the HapticMaster API to get these 3 values should go into getOrientation?
No, getPosition/getOrientation are methods of the VizExtensionSensor class, so you would override them in the MySensor class. The orientation data should be specified as a quaternion, so you will need to perform an appropriate conversion from the values provided by the HapticMaster.
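A standard way to do that conversion is from Euler angles to a quaternion. Whether the three gimbal pots map directly onto yaw/pitch/roll is an assumption — the actual mapping depends on the ADL gimbal's geometry — but the math would look like this:

```cpp
#include <array>
#include <cmath>

// Convert yaw (Z), pitch (Y), roll (X) angles in radians to a quaternion
// (x, y, z, w), using the standard ZYX Euler convention. Whether the three
// gimbal pots map directly to these angles is an assumption; the real
// mapping depends on the ADL gimbal's geometry.
std::array<double,4> eulerToQuat(double yaw, double pitch, double roll)
{
    const double cy = std::cos(yaw * 0.5),   sy = std::sin(yaw * 0.5);
    const double cp = std::cos(pitch * 0.5), sp = std::sin(pitch * 0.5);
    const double cr = std::cos(roll * 0.5),  sr = std::sin(roll * 0.5);
    return {
        sr * cp * cy - cr * sp * sy,  // x
        cr * sp * cy + sr * cp * sy,  // y
        cr * cp * sy - sr * sp * cy,  // z
        cr * cp * cy + sr * sp * sy   // w
    };
}
```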

Quote:

Connecting to the HapticMaster to obtain these values happens over a network connection. I take it that, just like in my Ogre app, I define this connection along with any other variables at the top of MyExtension.cpp, just like any program?
I'm not familiar with the HapticMaster API, but if they have a function for connecting and disconnecting, then the most appropriate place for calling these functions would probably be within the constructor/destructor of the MyExtension class.
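The connect-in-constructor / disconnect-in-destructor pattern is plain RAII. A sketch, with placeholder connect/disconnect functions standing in for the non-public HapticMaster API calls (names and the global flag are made up for illustration):

```cpp
#include <stdexcept>
#include <string>

// Placeholders for the real (non-public) HapticMaster connect/disconnect
// calls; here they just track connection state for illustration.
static bool g_connected = false;
static int fakeConnect(const std::string& /*host*/) { g_connected = true; return 0; }
static void fakeDisconnect() { g_connected = false; }

// RAII wrapper: connect when the extension is created, disconnect when it
// is destroyed, so the device connection follows the extension's lifetime.
class HapticMasterConnection {
public:
    explicit HapticMasterConnection(const std::string& host) {
        if (fakeConnect(host) != 0)
            throw std::runtime_error("could not connect to HapticMaster");
    }
    ~HapticMasterConnection() { fakeDisconnect(); }
    HapticMasterConnection(const HapticMasterConnection&) = delete;
    HapticMasterConnection& operator=(const HapticMasterConnection&) = delete;
};
```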

Is the HapticMaster API publicly available?

pwsnow 04-25-2012 05:20 AM

Many thanks for your reply.

That has cleared it up further; there's probably still a way to go though!

I take it that once the project has compiled and run, producing the correct .dle file, Visual Studio is no longer needed?

I'll use the sensor VS project files and extend it to work with the HapticMaster.

The HapticMaster API isn't publicly available, but I can message you with some HapticMaster API documentation attached; I'll do that now. If it's any help, that would be great.

I'll keep working on it and come back.

Many thanks again.

pwsnow 05-08-2012 12:39 PM

I didn't know if I should post this as a new topic, but I guess it's a continuation of this one.

I have managed to get both the HapticMaster and Gimbal working with Vizard 4.0. So now we have 3DOF from the HapticMaster and 3DOF from the Gimbal, making a total of 6DOF input into Vizard :)

I've searched the forums but I can't find much about inverse kinematics in Vizard 4.0. Currently I am trying (and have succeeded, to a point) to move the default character's hand (the character provided in the Vizard demo) to follow a point using lookAt. My problem is getting the other limbs (right-hand side only) to follow: would I use lock() and parent nodes so that inverse kinematics kicks in automatically, or would I have to code the movements by hand to get them working correctly?
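For context, the kind of thing I'm trying to achieve for a single arm is the standard analytic two-link IK solve. A generic sketch (not Vizard's animation API; segment lengths and targets are made-up illustration values):

```cpp
#include <cmath>
#include <utility>

// Analytic IK for a two-link planar arm with segment lengths l1, l2.
// Returns joint angles (q1, q2) in radians that place the end effector at
// (x, y). Assumes the target is reachable; this is a generic illustration,
// not tied to the Vizard demo character's skeleton.
std::pair<double,double> twoLinkIK(double l1, double l2, double x, double y)
{
    const double c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2);
    const double q2 = std::acos(c2);  // elbow angle ("elbow up" branch)
    const double q1 = std::atan2(y, x)
                    - std::atan2(l2 * std::sin(q2), l1 + l2 * std::cos(q2));
    return std::make_pair(q1, q2);
}
```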

Any help or examples would be gratefully received.

j.ronner 01-29-2013 07:50 AM

Hi pwsnow,

We ordered a HapticMaster and also want to get it working in Vizard. Is it possible to share your solution with us?

Best,
Jacco

pwsnow 03-03-2014 03:44 AM

Quote:

Originally Posted by j.ronner (Post 14590)
Hi pwsnow,

We ordered a HapticMaster and also want to get it working in Vizard. Is it possible to share your solution with us?

Best,
Jacco

Apologies for not replying. I have been working on an IK solver for the HapticMaster (position & orientation) for a while and I am very nearly there; only a few things are left to solve. Currently I'm working with a static position and orientation, to test the IK for debugging without the HapticMaster end effector position and gimbal orientation. But I have separately managed to get both working correctly with Vizard.

Drop me a PM and we can exchange information. Once I have the IK fully working I will post the code here on the forum. The solver I am working on takes a vector position and a quaternion orientation, so it can be used with both the HapticMaster and manual input (mouse, etc.).

Regards,

Peter



Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2024, vBulletin Solutions, Inc.
Copyright 2002-2023 WorldViz LLC