
Introduce support for Kinova Jaco2 robot #625

Closed
wants to merge 18 commits

Conversation

@fspindle (Contributor)

@engr-zubair I introduced all the CMake material and files so that you can start to modify the vpRobotJaco2 class, a wrapper over the Jaco2 SDK. This initial work follows PR #619 that you initiated.

engr-zubair and others added 6 commits September 1, 2019 14:12
CMake file FindJaco2.cmake to find the Jaco2 headers and libraries. Please check, edit and include for the Jaco2 robot. Thanks.
Zubair Arif
Harbin Institute of Technology, HIT China
Updated to add the use_jaco2 option at several places in the code. Please check, rectify and include in the main code for using the Kinova Jaco2 robot with ViSP.
Changes are made in these files to include and use the VISP_HAVE_JACO2 option.
- This work relates to PR lagadic#619, initiated by Zubair Arif, research scholar at Harbin Institute of Technology
@fspindle (Author)

@engr-zubair Don't forget to git pull to get my last commit before committing your changes.

@fspindle (Author)

Yes, the MIXT_FRAME is typically here to handle this specific use case.
In setCartVelocity() I suggest that you modify inside the following case block:

  case vpRobot::MIXT_FRAME:
    // We suppose here that the input velocity twist v is 6-dim with:
    // v[0], v[1], v[2] = vx, vy, vz which represent the end effector’s translation velocities in the base reference frame
    // v[3], v[4], v[5] = wx, wy, wz which correspond to the end effector’s rotation velocities in the effector reference frame
    // In practice you have just to wrap the values v[i] in the appropriate data structure and send the corresponding double values to the robot
    break;
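
To make the wrapping step concrete, here is a minimal self-contained sketch. The CartesianInfo struct below is a stand-in whose field names mirror the Kinova SDK's cartesian trajectory members (an assumption for illustration; the real type comes from the SDK's KinovaTypes.h):

```cpp
#include <cassert>
#include <vector>

// Stand-in for the Kinova SDK cartesian velocity structure
// (assumption: the real SDK type exposes similar float fields).
struct CartesianInfo {
  float X, Y, Z;                // translational velocities [m/s]
  float ThetaX, ThetaY, ThetaZ; // rotational velocities [rad/s]
};

// Wrap the 6-dim velocity twist v into the SDK structure:
// v[0..2] = vx, vy, vz of the end effector in the base frame,
// v[3..5] = wx, wy, wz of the end effector in the end-effector frame.
CartesianInfo wrapMixtVelocity(const std::vector<double> &v)
{
  assert(v.size() == 6);
  CartesianInfo p;
  p.X = static_cast<float>(v[0]);
  p.Y = static_cast<float>(v[1]);
  p.Z = static_cast<float>(v[2]);
  p.ThetaX = static_cast<float>(v[3]);
  p.ThetaY = static_cast<float>(v[4]);
  p.ThetaZ = static_cast<float>(v[5]);
  return p;
}
```

The actual case block would then hand the filled structure to the SDK's send function.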

@engr-zubair

@fspindle Please check: the cmake/FindJaco2.cmake file has some errors. I am unable to add/find the Jaco2 libraries while CMake is generating the project. Even if I manually add the location of the Jaco2 libraries, they still do not link while building the project inside Visual Studio 2017.

@fspindle (Author)

This could be due to the library names being suffixed with .dll in the FindJaco2.cmake file.

Try to put:

NAMES CommandLayerEthernet CommandLayerWindows CommunicationLayerEthernet CommunicationLayerWindows

To debug, add also:

message("JACO2_INCLUDE_DIRS: ${JACO2_INCLUDE_DIRS}")
message("JACO2_LIBRARIES: ${JACO2_LIBRARIES}")

@engr-zubair

@fspindle Dear Fabien, I have changed the names of the dll files as you suggested, solved the potential build issue, and got the right code for the FindJaco2.cmake file.
But the problem is not with the names of the libraries; the problem is their extension. The libraries provided by the Jaco robot SDK are .dll files, while CMake and Visual Studio only want .lib library files, not .dll ones.
Therefore, whenever I try to build the ViSP source with Visual Studio 2017 using FindJaco2.cmake, robot321.lib does not build and gives the error "cannot open or read 'commandlayerwindows.dll'", and the same error appears in all other modules.
[ You can download the libraries from the Kinova robot SDK here:
https://www.kinovarobotics.com/en/knowledge-hub/all-kinova-products (SDK 1.5.1) ]

Is there any other way to solve this issue? Or, may I suggest, could we use dynamic linking, i.e. referencing this Jaco2 .dll library in our project; would that work with ViSP?

@fspindle (Author)

The Jaco SDK works with plugins that are loaded at runtime. That's why they don't provide a classical .lib file alongside the .dll library.

I initiated the work. I will continue tomorrow to fix the small things that I have in mind. I will let you know when it's finished so that you can play with it.

@fspindle (Author)

@engr-zubair The vpRobotKinova class is almost finished on my side. I let you test, fix and improve it. There are 2 examples in example/servo-kinova, one to control the robot joints, the other one to control the robot in Cartesian space. The class should work with both the USB and the Ethernet control layer.

  • In Cartesian mode I made the assumption that the velocity that is considered is the velocity of the end-effector expressed in the end-effector frame (and not in the base frame).
  • I also consider that the robot has 6 joints. From the SDK API it seems that 7 joints could be considered, but I didn't find a function that says whether the device has 6 or 7 joints.
  • I didn't introduce finger control. It should be easy to add new functions:
setPosition(const vpRobot::vpControlFrameType, const vpColVector &q, const vpColVector &q_finger)
setVelocity(const vpRobot::vpControlFrameType, const vpColVector &q, const vpColVector &v_finger)

where q_finger and v_finger are 3-dim vectors.
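
A hedged sketch of what the body of such an overload could do, here simply concatenating the joint and finger commands into one 9-dim vector before forwarding it to the SDK (plain std::vector is used instead of vpColVector, and buildJointAndFingerCommand is a hypothetical helper, not part of the actual vpRobotKinova API):

```cpp
#include <cassert>
#include <vector>

// Hypothetical helper: concatenate a 6-dim joint command with a
// 3-dim finger command into a single 9-dim vector. Names and the
// 9-dim layout are illustrative assumptions, not the real SDK layout.
std::vector<double> buildJointAndFingerCommand(const std::vector<double> &q,
                                               const std::vector<double> &q_finger)
{
  assert(q.size() == 6 && q_finger.size() == 3);
  std::vector<double> cmd(q);
  cmd.insert(cmd.end(), q_finger.begin(), q_finger.end());
  return cmd; // cmd[0..5]: joints, cmd[6..8]: fingers
}
```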

@engr-zubair

@fspindle Thanks, I have gone through the changes and am now implementing and testing them with the Jaco2 robot this week. I will get back to you with results and suggestions.

@engr-zubair

@fspindle Hello, I have checked and tested the examples given in the Kinova Jaco SDK and found that 'Example_Ethernet_7DOF_Kinematics' works in MSVS, detecting the robot over Ethernet when using the same parameters as given in the SDK or manual.
However, I noted a difference between the USB and Ethernet commands: we have to initialize more functions and commands for the Ethernet load library at the top of the source code. You can refer to the source code of this example. Thanks.

@fspindle (Author)

Can you test my commit?

@fspindle (Author)

@engr-zubair Any updates?

@engr-zubair

Hello, yes, I was double-checking before letting you know of the update about your code in MIXT_FRAME. The observations are as follows:

  1. The file you committed has some conflicts with my existing visp-kinova branch, so it was not building; I therefore introduced the same changes you made into my kinovarobot.cpp file and applied them to the kinovajacocartmove example.

  2. The build was OK, but when using the mixt-frame code in kinovarobot.cpp the robot does not move, without giving any error. It does move when the frame is changed to the end-effector frame in the same code/project:
    robot.setVelocity(vpRobot::MIXT_FRAME, v_e);             // v_e = eVc * v_c, where v_c is the camera velocity from the control law
    robot.setVelocity(vpRobot::END_EFFECTOR_FRAME, v_mixed); // vpColVector v_mixed = bVe * v_e;

  3. If the same mixt-frame code is used in the main code of kinovajacocart.cpp, it works.

  4. The code in the mixt or camera frame and the transformation are right; I have tested them separately by mapping the frames on the real robot. The velocity is transformed from the base frame to the end-effector frame.

  5. When I test the same code in the frankaIBVS example, introducing the changes for the Kinova robot and inserting my camera eMc extrinsic parameters, it does not converge: the robot moves in and out, leaving the features. It also does not move in the mixt frame, but again moves in the end-effector or camera frame.

  6. The problem I see is that the mixt-frame code does not work inside kinovarobot.cpp, but the same code works outside, in the main script of the frankaIBVS example.

  7. The good news is that the Kinova operates in Ethernet mode with the frankaIBVS example settings, which previously did not work in the kinovajacomovecart example. Please have another look, or provide another example to test the mixt-frame code.
    Thanks,
    Zubair

@fspindle (Author)

You should first validate the controller implemented in vpRobotKinova before trying a visual-servoing scheme like the one implemented in the servoFrankaIBVS.cpp example.
There is servoKinovaJacoCart.cpp that I introduced to this end.

  1. This example allows applying a Cartesian velocity to the robot end-effector. By default it applies a velocity of -0.10 m/s along the end-effector Y axis:
    vcart[1] = -0.10; // send -10 cm/s along the Y axis
    
  2. Does it work as expected ?
  3. Do the same test sending a velocity along X axis
    vcart[0] = -0.05; // send -5 cm/s along the X axis
    vcart[1] = 0;
    
  4. Do the same test sending a velocity along Z axis
    vcart[0] = 0;
    vcart[1] = 0;
    vcart[2] = 0.05; // send 5 cm/s along the Z axis
    
  5. Do the same test sending a velocity along X and Y axis
    vcart[0] = 0.05;
    vcart[1] = 0.05;
    vcart[2] = 0;
    
  6. Continue testing rotational velocities, first along X axis
    vcart[0] = 0;
    vcart[1] = 0;
    vcart[2] = 0;
    vcart[3] = vpMath::rad(5); // 5 deg/s along X axis
    
  7. Continue testing rotational velocities, then along the Y axis (and likewise the Z axis)
    vcart[0] = 0;
    vcart[1] = 0;
    vcart[2] = 0;
    vcart[3] = 0;
    vcart[4] = vpMath::rad(5); // 5 deg/s along Y axis
    
  8. Then you can continue mixing translation and rotational velocities. At this point everything should work as expected.
  9. Then you have to continue and test cartesian velocities in the camera frame. To this end you can use the same servoKinovaJacoCart.cpp example and first test with a pure rotational velocity along the camera Z axis (also the optical axis) modifying
    vcart = 0;
    vcart[5] = vpMath::rad(5); // apply 5 deg/s along camera Z axis
    vpHomogeneousMatrix eMc;
    // Here update the content of your eMc matrix that gives the transformation between the
    // end-effector and the camera frame (where is the camera frame wrt the end-effector frame)
    // This matrix should differ from the one used in the Franka example
    robot.set_eMc(eMc);
    // Send a new velocity every 5 ms
    for (unsigned int i = 0; i < 300; i++) {
      // We send the velocity vector as long as we want the robot to move along that vector
      robot.setVelocity(vpRobot::CAMERA_FRAME, vcart);
      vpTime::wait(5);
    }
    
  10. If this works, it will be straightforward to adapt servoFrankaIBVS.cpp for the Jaco robot.

@engr-zubair

Hi, I have tested your code as per your guidance, strictly following the instructions. I got the following results using your kinovajacomovecart example and kinovarobot.cpp file:

  1. The code builds and runs without error.
  2. [Points 1-8] The end-effector frame works fine, except that it has reversed signs/directions compared to the same command used in the Kinova API.
  3. In MIXT_FRAME the robot controller doesn't move at all, without giving any exception or error.
  4. The camera frame works, but the camera zooming axis is the Y axis in this code: the camera moves in/forward when the Y-axis command is changed, the Z axis is the camera's vertical height, and accordingly X comes out of the screen:
    cam => face ___________> y
    !
    !
    !
    Z
  5. Please guide further, thanks.

@fspindle (Author)

  • (2) I don't understand.
  • (3) I don't see why you use the mixt frame. You should only use the end-effector and the camera (or tool) frame.
  • (4) You didn't modify the content of the eMc matrix, which is by default the identity when you call vpHomogeneousMatrix eMc;
  • (5) Impossible without more precise information:
    • Attach an image of your setup (robot + camera).
    • Identify where the end-effector axes are and draw them on a paper:
      - For +X end-effector, send only vcart[0] = 0.05 and see where it moves
      - For +Y end-effector, send only vcart[1] = 0.05 and see where it moves
      - For +Z end-effector, send only vcart[2] = 0.05 and see where it moves
    • Make a schematic with the end-effector axes that you just identified, and also put the camera frame axes (+Z-cam is along the optical axis, +X-cam is in the image plane to the right, +Y-cam is in the image plane going down).

@engr-zubair

This comment has been minimized.

@engr-zubair

@fspindle Dear Fabien, I am waiting for your response to progress further regarding the development of visual servo control (IBVS/PBVS) of the Kinova Jaco2 robot.
I hope you can find some time this week.

@fspindle (Author)

What is the value of your eMc matrix?

@engr-zubair

This comment has been minimized.

@fspindle (Author)

Considering the following image you sent me:
kinova jaco- frames-zubair

  • I understand that drawing (2), the end-effector frame, and (3), your wrong camera frame, have the same orientation. In the drawing, it's just the point of view that differs. If I'm right, it means here that you used eMc = eye().
  • I understand also that the right camera frame corresponds to drawing (4). If I'm also right, your eMc matrix sounds wrong to me:
    -0.005742676471  -0.9978489581  -0.06530297456  0.05345567035
    0.9993569641  -0.003415469108  -0.03569303616  -0.009516983454
    0.03539321866  -0.06546595596  0.9972269194  -0.04510223842
    0  0  0  1
    
  • From my point of view it should be more like:
     0   0   1   0.05345567035
    -1   0   0  -0.009516983454
     0  -1   0  -0.04510223842
     0   0   0   1
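
The shape of that matrix follows from the axis mapping: its rotation columns are the camera axes expressed in the end-effector frame (z-cam = +x-ee, x-cam = -y-ee, y-cam = -z-ee). A self-contained sketch (plain arrays instead of vpHomogeneousMatrix, purely for illustration) that builds such an eMc and sanity-checks the rotation part:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<std::array<double, 4>, 4>;
using Vec3 = std::array<double, 3>;

// Build a homogeneous eMc from the camera axes expressed in the
// end-effector frame (they become the columns of the rotation block)
// plus the translation t of the camera origin in the end-effector frame.
Mat4 buildEMC(const Vec3 &x_cam, const Vec3 &y_cam, const Vec3 &z_cam, const Vec3 &t)
{
  Mat4 M{};
  for (int i = 0; i < 3; ++i) {
    M[i][0] = x_cam[i];
    M[i][1] = y_cam[i];
    M[i][2] = z_cam[i];
    M[i][3] = t[i];
  }
  M[3] = {0.0, 0.0, 0.0, 1.0};
  return M;
}

// Determinant of the 3x3 rotation block; +1 is a necessary condition for
// a valid rotation (and sufficient here, where columns are signed unit axes).
double rotationDet(const Mat4 &M)
{
  return M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
       - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
       + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]);
}
```

With x_cam = (0,-1,0), y_cam = (0,0,-1), z_cam = (1,0,0) this reproduces the rotation block of the matrix proposed above.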
    

@li2jq

li2jq commented Jan 4, 2020

@engr-zubair Hi Zubair Arif, I am Jiqing Li, a student from the ShenZhen Institutes of Advanced Technology, Chinese Academy of Sciences. I am also going to do some research on visual servoing with the Kinova Jaco2 robot and a RealSense D435i. I browsed this whole pull-request conversation, and at the end you did not show any progress on IBVS for the Kinova Jaco2 robot. Can you share your recent progress on IBVS for the Jaco2? I hope you can find some time to share your work. Thank you!

@engr-zubair

Hello, dear @li2jq, nice to hear from you. This code is still premature and in test phases. As per Fabien's advice, I am still debugging the camera frame problem before we can move on to visual servoing. For now this code does not support camera frame movement as desired, so please contribute or suggest any change which might be needed to correct this problem.

Could you please also check whether you can connect to your Kinova Jaco2 with this code through the Ethernet interface? Thanks.

@fspindle (Author)

fspindle commented Jan 8, 2020

@engr-zubair Nice to hear that the camera frame, end-effector frame and mixt frame are working as expected after a fresh code checkout and build.

@li2jq

li2jq commented Jan 9, 2020

@engr-zubair @fspindle Hi Zubair and Fabien, before ViSP I only used kinova-ros to control the Jaco2, so I have no experience with the Ethernet interface, and my project did not need it. Sorry, I did not spend time on that test.

(I tried to do the extrinsic calibration for the Kinova Jaco2 and the RealSense D435i these days, and at the last calibration step (I already had the pairs of pose_cPo_.yaml and pose_fPe_.yaml files) I got an error. The resulting eMc was obviously incorrect, since the xyz values were far too large. Could you give some instructions on how to solve this problem?)

I solved this problem and found out that it was caused by some wrong pose_fPe files. I do not know why some BASE_FRAME to END_EFFECTOR_FRAME transformations can be wrong, since they are obtained directly from the Kinova API. After I deleted the three obviously wrong fPe transformations, I got the extrinsic transformation from the END_EFFECTOR_FRAME to the CAMERA_FRAME. However, it looks wrong, since the Z estimate is obviously too small. Before I learned ViSP I used a GitHub project called easy_handeye for calibration, and its result is quite different from the one estimated with ViSP. I am going to redo the extrinsic calibration to dig into the problem.
(Update: I redid the extrinsic calibration and found out that the final eMc result changes as the ndata parameter changes. The influence is substantial: the Y position value can vary by about 0.1 m. Still, the eMc result is not reasonable, and I have not found the reason.)

I also tested servoKinovaJacoCart.cpp. For the Cartesian velocity control, when I tried MIXT_FRAME, the input value +x acted as +y, +y as +z and +z as +x, and the movement was the same as in END_EFFECTOR_FRAME. I then looked in the vpRobotKinova.cpp file and found that these two case blocks are the same, which is why they produce the same movement. The result for the END_EFFECTOR_FRAME case is correct; to correct MIXT_FRAME, just change v_mix[0..2] to v_e[0..2]. The code should look like:
  // pointToSend.Position.CartesianPosition.X = static_cast<float>(v_mix[0]);
  // pointToSend.Position.CartesianPosition.Y = static_cast<float>(v_mix[1]);
  // pointToSend.Position.CartesianPosition.Z = static_cast<float>(v_mix[2]);

  pointToSend.Position.CartesianPosition.X = static_cast<float>(v_e[0]);
  pointToSend.Position.CartesianPosition.Y = static_cast<float>(v_e[1]);
  pointToSend.Position.CartesianPosition.Z = static_cast<float>(v_e[2]);
  pointToSend.Position.CartesianPosition.ThetaX = static_cast<float>(v_e[3]);
  pointToSend.Position.CartesianPosition.ThetaY = static_cast<float>(v_e[4]);
  pointToSend.Position.CartesianPosition.ThetaZ = static_cast<float>(v_e[5]);

Then MIXT_FRAME is correct, which means the XYZ linear velocity is based on the BASE_FRAME and the Theta XYZ angular velocity is based on the END_EFFECTOR_FRAME.
Also, there is an interesting bug that I cannot solve. When I used an Ubuntu 18.04 system to make the code changes in vpRobotKinova.cpp, no matter how I compiled it, even after deleting all the build files, the changed code never took effect. Fortunately, I have another Ubuntu 16.04 system, and amazingly it worked there. A possible explanation is the Jaco SDK, which is only verified under Ubuntu 14 and 16.
Since the VVS error in my extrinsic calibration also comes from Ubuntu 18, I am going to try it under Ubuntu 16.
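
To make the frame bookkeeping above concrete, the following self-contained sketch (plain arrays, not ViSP types) rotates the linear part of a base-frame velocity into the end-effector frame using the transpose of an assumed base-to-end-effector rotation bRe; this is the kind of conversion involved when mixing the two frame conventions:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Rot3 = std::array<Vec3, 3>;

// Express a linear velocity given in the base frame in the end-effector
// frame: v_ee = eRb * v_base, with eRb = transpose(bRe).
// bRe is the (assumed known) rotation of the end-effector frame
// with respect to the base frame.
Vec3 baseToEeLinear(const Rot3 &bRe, const Vec3 &v_base)
{
  Vec3 v_ee{0.0, 0.0, 0.0};
  for (int i = 0; i < 3; ++i)
    for (int j = 0; j < 3; ++j)
      v_ee[i] += bRe[j][i] * v_base[j]; // bRe[j][i] is transpose(bRe)[i][j]
  return v_ee;
}
```

For example, with the end-effector rotated 90 degrees about the base Z axis, a base-frame velocity along +X becomes a velocity along -Y in the end-effector frame.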

@fspindle (Author)

@li2jq Feel free to commit your changes in this PR or create a new PR. Since I don't have access to a Jaco robot, I cannot debug on my side.

@fspindle (Author)

fspindle commented Nov 2, 2020

@engr-zubair Any update is welcome. Did you finally succeed in controlling the Jaco arm with the vpRobotJaco2 class?

@engr-zubair

Hello @fspindle, good to hear from you. It's been a long time; since the COVID-19 pandemic, the labs at my university were closed.
Yes, the vpRobotJaco2 class is working fine with the Jaco2 robotic arm in the end-effector and camera frames; I have used it in different applications and settings.
For the Jaco2 6-DoF variant the code works fine. However, I have made changes to the robot's DoF in the vpRobotKinova.cpp file, as my Jaco2 arm is 7-DoF, and the class creates problems without giving an error when controlling the 7-DoF arm.
This repo has some code-integration problems with your source ViSP repo, so it needs to be revised and some changes need to be undone before it can be integrated into the main ViSP repository. This needs your supervision.

@li2jq

li2jq commented Nov 2, 2020

Hello @fspindle, I succeeded in testing vpRobotJaco2 with my Kinova Jaco2 j2n6s300 robot. However, I cannot reach the accuracy of your Panda robot example. I guess it may be caused by the Kinova robot itself, since at the end of the robot movement the joint velocities are all so small that the Kinova robot cannot execute them. Below is a link that shows my robot test. Please check it, and any comments are welcome if you see any problems.
https://www.bilibili.com/video/BV1ay4y1r7as

@fspindle (Author)

fspindle commented Nov 2, 2020

Nice to hear from you. It sounds like we are very close to something that could be integrated into the next ViSP release.
To this end I have created a fresh PR #839 with the last version of the code and small changes to improve Jaco SDK CMake detection on Ubuntu. The class is now also compatible with the 6- and 7-DoF variants.

@li2jq I would appreciate it if you could test PR #839 with your 6-DoF Jaco robot.

@engr-zubair I would appreciate it if you could test PR #839 with your 7-DoF Jaco robot.

I will close this PR. We can continue to discuss on PR #839.

@fspindle closed this Nov 30, 2020