Dependencies on indigo? #2
Hi broesdecat, sorry for this very late response. This project is highly dependent on other repositories of ours and is not yet meant to be usable standalone. However, I would be interested to know what your usage would be; I might be able to guide you through the installation if I know which part you want to reuse. The basic concept is a human model that you can move (FK and IK), and this does not require many dependencies. Try demo.launch to see if you can already start this. Other launch files would allow you to calibrate the human model for recording with Kinect or Optitrack cameras, but these are trickier to set up. Thanks for your interest in this project.
Hi, I would like to use calibration and tracking of a human with just Optitrack and visualize the human in Rviz. Thank you in advance!
Hi,
Thanks for your interest in this. I am sorry it is poorly documented; it is
not meant for a release at the moment. The general idea is that we create a
personalized human model from a Kinect skeleton and fit the Optitrack frames
onto it if you are using Optitrack as the main sensor. If you don't have a
Kinect or want to use a generic model, I can push the missing file, or you
can create it with these values:
{
"foot_length": 0.1,
"forearm_length": 0.20139506687084419,
"hand_length": 0.1,
"head_radius": 0.1,
"hip_offset_height": 0.06097154425004092,
"hip_offset_width": 0.07859968721994195,
"neck_length": 0.1568653864845268,
"neck_radius": 0.04,
"shin_length": 0.26128449188350694,
"shoulder_offset_height": -0.008297309670016684,
"shoulder_offset_width": 0.1579177854494185,
"spine_down_length": 0.04547537608402022,
"spine_down_radius": 0.1179177854494185,
"spine_up_length": 0.16739251552466408,
"spine_up_radius": 0.13791778544941852,
"thigh_length": 0.35401409759420954,
"thigh_radius": 0.05,
"torso_length": 0.060975937259304634,
"torso_radius": 0.1579177854494185,
"upper_arm_length": 0.18521827187060153
}
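If it helps, a short script can write these values to the tmp/human_length.json location the launch files look for (a sketch only; adjust pkg_dir to wherever human_moveit_config is checked out on your machine):

```python
import json
import os

# Default model values (metres), copied from the message above.
DEFAULT_MODEL = {
    "foot_length": 0.1,
    "forearm_length": 0.20139506687084419,
    "hand_length": 0.1,
    "head_radius": 0.1,
    "hip_offset_height": 0.06097154425004092,
    "hip_offset_width": 0.07859968721994195,
    "neck_length": 0.1568653864845268,
    "neck_radius": 0.04,
    "shin_length": 0.26128449188350694,
    "shoulder_offset_height": -0.008297309670016684,
    "shoulder_offset_width": 0.1579177854494185,
    "spine_down_length": 0.04547537608402022,
    "spine_down_radius": 0.1179177854494185,
    "spine_up_length": 0.16739251552466408,
    "spine_up_radius": 0.13791778544941852,
    "thigh_length": 0.35401409759420954,
    "thigh_radius": 0.05,
    "torso_length": 0.060975937259304634,
    "torso_radius": 0.1579177854494185,
    "upper_arm_length": 0.18521827187060153,
}

def write_default_model(pkg_dir="."):
    """Write tmp/human_length.json under pkg_dir and return its path."""
    tmp_dir = os.path.join(pkg_dir, "tmp")
    if not os.path.isdir(tmp_dir):
        os.makedirs(tmp_dir)
    path = os.path.join(tmp_dir, "human_length.json")
    with open(path, "w") as f:
        json.dump(DEFAULT_MODEL, f, indent=4, sort_keys=True)
    return path
```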
The given values correspond to the default URDF pushed to the repo. Now,
for the Optitrack, you would need multiple frames attached to the human
body:
- /opt/human/base (located on the waist of the subject)
- /opt/human/head
- /opt/human/shoulder_center (located at the torso level)
- /opt/human/right_elbow
- /opt/human/left_elbow
- /opt/human/right_hand
- /opt/human/left_hand
Those are the names of the tf transforms you need to publish from Optitrack.
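As a quick sanity check before calibrating, you can compare the frames your Optitrack bridge actually publishes against that list (an illustrative helper, not part of the repo):

```python
# The seven tf frames the calibration expects, as listed above.
REQUIRED_FRAMES = [
    "/opt/human/base",
    "/opt/human/head",
    "/opt/human/shoulder_center",
    "/opt/human/right_elbow",
    "/opt/human/left_elbow",
    "/opt/human/right_hand",
    "/opt/human/left_hand",
]

def missing_frames(published_frames):
    """Return the required frames absent from published_frames.

    published_frames is any iterable of frame names, e.g. gathered
    from a tf listener or from `rosrun tf tf_monitor` output.
    """
    published = set(published_frames)
    return [f for f in REQUIRED_FRAMES if f not in published]
```

If the returned list is empty, all frames needed by the calibration are being published.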
Then you can use the script *calibrate_human_model* in the script folder. It
will prompt you to generate a new model, which you can skip. It will then
launch the calibration process. The calibration asks the subject to stand
in a T pose (both arms horizontal). If all frames are visible, you should
see numbers displayed with decreasing values. A proper calibration is
achieved when the numbers are below 1. There is no verification step, so
you would need to restart the process if it does not reach the desired
values. Once the Optitrack frames are calibrated, you should be able to
launch *human_tracker.launch* with the rviz option and see the moving
model.
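The "numbers below 1" stopping criterion described above can be expressed as a simple check (a sketch only; I am assuming the displayed numbers are per-frame fitting residuals, with the threshold of 1 quoted from the message above):

```python
CALIBRATION_THRESHOLD = 1.0  # per the description above: good when below 1

def is_calibrated(residuals, threshold=CALIBRATION_THRESHOLD):
    """True once every displayed residual has dropped below threshold."""
    residuals = list(residuals)
    return bool(residuals) and all(r < threshold for r in residuals)
```

Since the script performs no such verification itself, you have to watch the numbers and restart manually if they plateau above the threshold.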
Some things are worth noting as I don't know what you are trying to achieve:
- This will definitely be less efficient and precise than the Optitrack
module for human tracking. It is just a workaround, as the Optitrack
solution is too expensive and requires a full suit, which was not optimal
for our experiments.
- Only the upper body is tracked. The model should move around the
workspace thanks to the tracking of the base (waist) frame, but the legs
will be static. You should be able to see the motions of the arms.
Hope that helps.
Cheers.
Baptiste.
2017-11-16 15:24 GMT+01:00 brinij <notifications@github.com>:
… Hi,
I would like to use calibration and tracking of human just with Optitrack
and visualize human in Rviz.
In human_tracker.launch it fails to load "$(find
human_moveit_config)/tmp/human_length.json" because there is no tmp
folder. My question is: should I somehow create that file, or is it created
automatically after the calibration? And do you know the preconditions for
running calibration.launch?
Thank you in advance!
Cheers.
What about versioning this default JSON on git so that users can potentially run without prior calibration? @buschbapti
Thanks Baptiste,
Hi,
I just checked out the repository in my indigo catkin workspace and, after installing trac_ik and nlopt, it builds without issues.
However, when I try to launch it, only display_urdf.launch succeeds; most of the others fail with various issues.
Are there additional dependencies I should install, or should the launch files be run in a specific order?
Thx!
Broes