This repository has been archived by the owner on Mar 17, 2021. It is now read-only.

Add instructions for mocap fusion using EKF2 #573

Merged
merged 4 commits into from
Dec 3, 2018

Conversation

mzahana
Contributor

@mzahana mzahana commented Aug 6, 2018

This PR updates the instructions on setting up EKF2 to fuse MOCAP data using MAVROS.

* Align your robot's forward direction with the system's +x axis
* Define a rigid body in the Motive software. Give the robot a name that does not contain spaces, e.g. `robot1` instead of `Rigidbody 1`
* Enable Frame Broadcast and VRPN streaming
* Set the Up axis to be the Z axis (the default is Y)
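Once streaming is enabled, a common way to receive the rigid-body pose in ROS is the `vrpn_client_ros` package. A minimal sketch (not part of this PR; the IP address is a placeholder for the machine running Motive):

```bash
# Assumes ROS and vrpn_client_ros are installed.
# sample.launch ships with vrpn_client_ros; replace the IP with the
# address of the machine running Motive.
roslaunch vrpn_client_ros sample.launch server:=192.168.1.100
```

The rigid-body pose should then appear on `/vrpn_client_node/<rigid_body_name>/pose`.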
Collaborator

Is it obvious how to do this in the MOCAP software?

Contributor Author

I will try to get some demonstrative pictures.

Member

I think it still makes sense to add some pictures (besides the videos).

Collaborator

@mzahana We merged this, but do you still think you might be able to get some images?


### OptiTrack MOCAP

The following steps explain how to feed position estimates from an OptiTrack system to PX4. It is assumed that the MOCAP system is calibrated.
Collaborator

  1. Can you tell me a good URL for getting an OptiTrack system /the system you used?

Contributor Author

@hamishwillee
Collaborator

@mzahana Thanks for this. It needs a subedit, which I can do afterwards.

@TSC21 I'm going to tidy the text, but can you do a review to make sure that all your open technical questions would be answered by this? I.e. does it capture the settings you think should be set?

@TSC21
Member

TSC21 commented Aug 8, 2018

@TSC21 I'm going to tidy the text, but can you do a review to make sure that all your open technical questions would be answered by this? I.e. does it capture the settings you think should be set?

Yes I will check it today.

@mzahana
Contributor Author

mzahana commented Aug 10, 2018

@hamishwillee I added some links to the OptiTrack setup steps

@hamishwillee
Collaborator

Thanks @mzahana - I'll look at this again after @TSC21 tells me that all the required technical information is present - Nuno?


* Enable external position fusion by setting `EKF2_AID_MASK` to enable vision position and yaw fusion
* To use the external height estimate for altitude estimation, set `EKF2_HGT_MODE` to vision
* Adjust the `EKF2_EV_DELAY` parameter according to how fast you receive the external position data relative to the flight controller's IMU. Reduce this parameter's value if you receive external data at a high rate, e.g. `50ms`
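As an illustration only (the parameter values here are assumptions; check the PX4 parameter reference for your firmware version), the settings above might be applied from a MAVLink/NuttX shell like this:

```bash
param set EKF2_AID_MASK 24     # assumed: bit 3 (vision position) + bit 4 (vision yaw)
param set EKF2_HGT_MODE 3      # assumed: 3 = vision height
param set EKF2_EV_DELAY 50     # in ms; needs empirical tuning for your setup
```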
Member

This is not entirely correct. This value actually represents how far off the timestamp of the measurement is off from the "actual" time it was captured at. It can technically be set to 0 if there is correct timestamping (not just arrival time) and timesync (e.g NTP) between mocap and ROS computers. In reality, this needs some empirical tuning since delays in the entire Mocap->PX4 chain are very setup specific and there is rarely a well setup system with an entirely synchronised chain.

Empirical tuning involves looking at the EKF innovations during dynamic maneuvers, and doing a parameter search for this value which yields the lowest innovations.
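To illustrate the idea, here is an offline sketch with synthetic data (not a PX4 tool, and the signal names are made up): the offset can be approximated by cross-correlating an IMU-derived rate with the corresponding rate from the external-vision data and taking the lag with the highest correlation.

```python
import numpy as np

def estimate_delay(imu_signal, ev_signal, dt):
    """Estimate the lag of ev_signal relative to imu_signal (in seconds)
    by finding the shift that maximises their cross-correlation."""
    imu = imu_signal - np.mean(imu_signal)
    ev = ev_signal - np.mean(ev_signal)
    corr = np.correlate(ev, imu, mode="full")
    # Zero lag sits at index len(imu) - 1; a positive lag means ev is delayed.
    lag = np.argmax(corr) - (len(imu) - 1)
    return lag * dt

# Synthetic example: external-vision stream delayed by 50 ms vs. the IMU.
dt = 0.005                                  # 200 Hz log
t = np.arange(0, 10, dt)
imu_rate = np.sin(2 * np.pi * 0.5 * t)      # stand-in for a logged IMU rate
ev_rate = np.roll(imu_rate, 10)             # delayed copy: 10 * 5 ms = 50 ms

print(estimate_delay(imu_rate, ev_rate, dt))  # → 0.05
```

In practice one would extract the two signals from a flight log rather than synthesising them, and confirm the value by checking that EKF innovations shrink during dynamic maneuvers.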

Contributor Author

Probably I interpreted this incorrectly, but according to the docs here, EKF2_EV_DELAY is the Vision Position Estimator delay relative to IMU measurements.

Contributor

@ChristophTobler commented Aug 13, 2018

@mhkabir is right. It is the delay/offset and has nothing to do with the rate.
You can also check for that delay in the logs by comparing the IMU timestamps with the EV timestamps to see the offset.

Contributor Author

@ChristophTobler What I understood from @mhkabir is that it is not related to the IMU! Instead it is related to the correct stamping of the mocap data. How is it related to the IMU then?

I guess it is related to the fusion in the EKF, but now I'm not sure whether `EKF2_EV_DELAY` is the time offset between the IMU and vision stamping, or something else!

Contributor

It is related to the IMU as the IMU time stamping is the "base clock" for the EKF.

I guess it is related to the fusion in the EKF, but now I'm not sure whether `EKF2_EV_DELAY` is the time offset between the IMU and vision stamping, or something else!

It's exactly that.

Member

It's the offset from the "base clock", as Christoph describes.

Contributor

@ChristophTobler commented Aug 14, 2018

You can get a rough estimate by looking at a log (see picture):
(image: `ev_delay` log plot comparing IMU and external-vision rates)

Collaborator

@mzahana So can you fix up according to @mhkabir comment above : #573 (comment)

Collaborator

@ChristophTobler Perhaps a dumb question, but how do you get this log graph? Specifically, the legend at the bottom for the red and green lines has the same text, so it looks like you're comparing like with like (even though I know one must be the IMU rate and the other must be the external vision rate).

MAVROS provides a plugin to relay pose data published on `/mavros/vision_pose/pose` to PX4. Assuming that MAVROS is running, you just need to relay the pose topic that you get from the MOCAP system, `/vrpn_client_node/<rigid_body_name>/pose`, directly to `/mavros/vision_pose/pose`. This can be done easily using the `topic_tools` package as follows:

```bash
rosrun topic_tools relay /vrpn_client_node/<rigid_body_name>/pose /mavros/vision_pose/pose
```
Member

Unnecessary. Just remap the topic instead of relaying, which adds latency.
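A sketch of the remap approach (a hypothetical launch-file fragment; `robot1` is a placeholder rigid-body name, and the usual mavros launch arguments are omitted):

```xml
<!-- Make mavros subscribe directly to the VRPN topic instead of
     /mavros/vision_pose/pose, so no relay node is needed. -->
<node pkg="mavros" type="mavros_node" name="mavros">
  <remap from="/mavros/vision_pose/pose" to="/vrpn_client_node/robot1/pose" />
</node>
```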

@hamishwillee
Collaborator

@mhkabir @mzahana Can you guys tell me what this is waiting on for merging? I see a comment here: #573 (comment) - anything else?

@@ -10,9 +10,20 @@ The system can then be used for applications such as position hold indoors or wa

For vision, the MAVLink message used to send the pose data is [VISION_POSITION_ESTIMATE](https://mavlink.io/en/messages/common.html#VISION_POSITION_ESTIMATE) and the message for all motion capture systems is [ATT_POS_MOCAP](https://mavlink.io/en/messages/common.html#ATT_POS_MOCAP).

The mavros ROS-MAVLink interface has default implementations to send these messages. They can also be sent using pure C/C++ code and direct use of the MAVLink library. The ROS topics are: `mocap_pose_estimate` for mocap systems and `vision_pose_estimate` for vision. Check [mavros_extras](http://wiki.ros.org/mavros_extras) for further info.
The mavros ROS-MAVLink interface has default implementations to send these messages. They can also be sent using pure C/C++ code and direct use of the MAVLink library. The ROS topics are: `/mavros/mocap/pose` for mocap systems and `/mavros/vision_pose/pose` for vision. Check [mavros_extras](http://wiki.ros.org/mavros_extras) for further info.
Contributor

I think the /mavros/mocap/pose -> ATT_POS_MOCAP -> vehicle_mocap_odometry pipeline doesn't work for EKF2. EKF2 doesn't subscribe to the vehicle_mocap_odometry uORB topic, only vehicle_visual_odometry. And I don't think there is a path from ATT_POS_MOCAP to vehicle_visual_odometry, correct?

Member

Correct. You can also use the ODOMETRY MAVLink message to propagate data into vehicle_mocap_odometry, but that won't be subscribed in EKF2. The only way, for now, is to actually use VISION_POSITION_ESTIMATE or ODOMETRY with the proper frame_id set. You can always remap mocap topics to the respective vision topics in MAVROS if you want. The only estimators using mocap data are LPE+AEQ.

@hamishwillee
Collaborator

@TSC21 Can this be merged. If not, what is still required?

@mzahana
Contributor Author

mzahana commented Oct 29, 2018

@hamishwillee sorry for the late reply, but I am busy these days. I made some changes according to the comments. Please check. Thanks.

Member

@TSC21 left a comment

It needs to be stated that the ODOMETRY Mavlink message can also be used to feed mocap data into the FCU. This can be done using the odom plugin of MAVROS.

@hamishwillee
Collaborator

Thanks @mzahana . There are a couple of new comments. The most important one is c616fd9#r228903058
Once that is done this can be merged. If you have missed any typographic errors I will fix them as a post process.

@bramsvs
Contributor

bramsvs commented Nov 1, 2018

Correct. You can also use the ODOMETRY MAVLink message to propagate data into vehicle_mocap_odometry, but that won't be subscribed in EKF2. The only way, for now, is to actually use VISION_POSITION_ESTIMATE or ODOMETRY with the proper frame_id set. You can always remap mocap topics to the respective vision topics in MAVROS if you want. The only estimators using mocap data are LPE+AEQ.

It needs to be stated that the ODOMETRY Mavlink message can also be used to feed mocap data into the FCU. This can be done using the odom plugin of MAVROS.

@TSC21 Can you say how to set the proper frame_id? Also, how can you easily remap, for example, from a (vrpn_client_ros/mocap) geometry_msgs/PoseStamped message type to a nav_msgs/Odometry type?

There isn't any documentation on the odom plugin, so I'm a little bit lost on this.

(...is it a good idea to also subscribe to the vehicle_mocap_odometry uORB topic in the ekf2 module? Why isn't that done at the moment? Then we could enable the /mavros/mocap/pose -> ATT_POS_MOCAP -> vehicle_mocap_odometry pipeline again, which doesn't break compatibility with existing mocap ROS nodes publishing geometry_msgs/PoseStamped messages.)

@hamishwillee
Collaborator

@mzahana Can you please let me know ETA for this to be ready to merge? There were just a couple of minor points outstanding, so it would be great to get this in?

@hamishwillee
Collaborator

hamishwillee commented Dec 3, 2018

@TSC21 This has stalled, and there is an open question for you here: #573 (comment)

Do you think you can help push it to conclusion - ie merge and do final updates?

After this is in, I would like to split and reconstruct it again and merge/destroy this additional topic https://dev.px4.io/en/tutorials/motion-capture-vicon-optitrack.html .
The way I want to do this is to split out the concepts properly to differentiate between MoCap and VIO, to clarify what the messages are for each (or if they are the same, to make that clear), and to separate out what you do with ROS vs without.

The ROS topic might then just link to the common information in a page under Computer Vision, i.e. it is ROS configuration rather than the whole shebang.

Reasonable?

@mzahana
Contributor Author

mzahana commented Dec 3, 2018

@hamishwillee I apologize for not being responsive in real-time as I don't usually get time to follow up these days.

I did some changes as requested. Please check and let me know if there are further ones.

@TSC21 I am not currently aware of the odom plugin and have not worked with it. Please feel free to add your modifications.

Thanks.

@TSC21
Member

TSC21 commented Dec 3, 2018

@TSC21 Can you say how to set the proper frame_id? Also, how can you easily remap, for example, from a (vrpn_client_ros/mocap) geometry_msgs/PoseStamped message type to a nav_msgs/Odometry type?

For EKF2, the only frame_id supported is the one that matches MAV_FRAME_VISION_NED, since EKF2 only subscribes to vehicle_visual_odometry messages.
You can't remap between two different types of messages. You can, though, add a node in between that converts PoseStamped messages to Odometry messages. My question is: why would you want that, since you just have pose data without any covariances? You can just use the VISION_POSITION_ESTIMATE plugin.

(...is it a good idea to also subscribe to the vehicle_mocap_odometry uORB topic in the ekf2 module? Why isn't that done at this moment? Then we can enable the /mavros/mocap/pose -> ATT_POS_MOCAP -> vehicle_mocap_odometry pipeline again which doesn't break compatibility with existing mocap ros nodes publishing geometry_msgs/PoseStamped messages.)

It's not implemented because there wasn't any need to do it, as people can just use the vision topics. There's no compatibility break; you just need to use the proper topics and mapping if required. The Firmware code is not written to fit the ROS side; it's the other way around.

@TSC21
Member

TSC21 commented Dec 3, 2018

The ROS topic might then just link to the common information in a page under Computer Vision, i.e. it is ROS configuration rather than the whole shebang.

Reasonable?

Makes total sense @hamishwillee.

@TSC21
Member

TSC21 commented Dec 3, 2018

@TSC21 I am not currently aware of the odom plugin and have not worked with it. Please feel free to add your modifications.

OK I think this can be added further on.

@hamishwillee
Collaborator

@mzahana @TSC21 @bramsvs

Thanks all. I'm going to merge this now. We can iterate from this to improve.

@hamishwillee hamishwillee merged commit fedae04 into PX4:master Dec 3, 2018
@hamishwillee
Collaborator

hamishwillee commented Dec 4, 2018

In preparation for my next steps ...

@TSC21 Here are some statements. Can you please confirm that they are true:

  • VIO (Visual Inertial Odometry) gets vehicle pose (local position + orientation, + velocity) from a combination of visual odometry (position and velocity from onboard cameras) and the IMU.
  • MoCap is positioning from external cameras (watching the vehicle). Similar information to VIO, i.e. pose, but it does not supply velocity
  • PX4 supports these MAVLink messages for MoCap AND the resulting uorb topics are only handled by LPE: ODOMETRY, ATT_POS_MOCAP
  • PX4 supports these MAVLink messages for VIO AND the resulting uorb topics are only handled by EKF: VISION_POSITION_ESTIMATE
    • ie LPE does not handle "Vision" and EKF does not handle MoCap?
  • You stated that You can always remap mocap topics to the respective vision topics in MAVROS if you want.
    • You're basically saying that if you have MoCap data but you want to use EKF2 then you have to map the MOCAP topics to VISION_POSITION_ESTIMATE in ROS right? Are there instructions on how to do that?

Below are my guess at the paths. What are the ? values?
MAVROS topic -> MAVLink message -> uORB topic
LPE paths
/mavros/mocap/pose -> ATT_POS_MOCAP -> vehicle_mocap_odometry
? -> ODOMETRY -> vehicle_mocap_odometry
EKF2 paths
/mavros/vision_pose/pose -> VISION_POSITION_ESTIMATE -> ?

@TSC21
Member

TSC21 commented Dec 4, 2018

  • PX4 supports these MAVLink messages for VIO AND the resulting uorb topics are only handled by EKF: VISION_POSITION_ESTIMATE
  • ie LPE does not handle "Vision" and EKF does not handle MoCap?

Not exactly. Both VIO and MoCap are ways of getting visual feedback of the vehicle's pose/odometry. The main difference is the frame perspective: usually Visual Odometry (or Visual Inertial Odometry) gets the data from the vehicle's perspective, since the sensors are onboard the vehicle - this is called egomotion. On the other hand, MoCap systems use a system of cameras to determine the pose of a certain entity in 3D space, so the sensors are not on the vehicle; it's rather an external system that tells the vehicle "you are here, with this orientation".
In the case of LPE, it's possible to feed this visual estimation through both the MoCap and Vision pipelines. So this would be, normally:

  • /mavros/mocap/pose -> ATT_POS_MOCAP -> vehicle_mocap_odometry
  • /mavros/vision_pose/pose -> VISION_POSITION_ESTIMATE -> vehicle_visual_odometry

Though, we also have the odom plugin from MAVROS, which also allows the pipelines below:

  • /mavros/odometry/odom -> ODOMETRY -> vehicle_mocap_odometry
  • /mavros/odometry/odom -> ODOMETRY -> vehicle_visual_odometry

The differentiation between the two is done on the PX4 mavlink_receiver according to the frame_id of the message - if set to MAV_FRAME_VISION_NED, it goes to vehicle_visual_odometry, if set to MAV_FRAME_MOCAP_NED, then it goes to vehicle_mocap_odometry.
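The routing rule just described can be sketched as follows (an illustrative model only, not the actual mavlink_receiver source; the enum values are placeholders):

```python
# Illustrative model of the ODOMETRY routing rule described above;
# not PX4 source code, and the enum values are placeholders.
MAV_FRAME_VISION_NED = 1
MAV_FRAME_MOCAP_NED = 2

def route_odometry(frame_id):
    """Return the uORB topic an incoming ODOMETRY message would feed."""
    if frame_id == MAV_FRAME_VISION_NED:
        return "vehicle_visual_odometry"
    if frame_id == MAV_FRAME_MOCAP_NED:
        return "vehicle_mocap_odometry"
    raise ValueError("unsupported frame_id")

print(route_odometry(MAV_FRAME_VISION_NED))  # → vehicle_visual_odometry
```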

For EKF2, though, only the "vision" pipeline is allowed, meaning:

  • /mavros/vision_pose/pose -> VISION_POSITION_ESTIMATE -> vehicle_visual_odometry
  • /mavros/odometry/odom -> ODOMETRY -> vehicle_visual_odometry

What this means is that if you want to feed MoCap data to PX4 to be used by EKF2, you have two ways:

  • If your MoCap ROS topic is of type geometry_msgs/PoseWithCovarianceStamped or geometry_msgs/PoseStamped (the usual case), you need to remap it to /mavros/vision_pose/pose. This is the most common case, as usually MoCap doesn't have associated covariances to the data.
  • If you instead have a way of obtaining the data through a nav_msgs/Odometry ROS message, then you would need to remap it to /mavros/odometry/odom.

VISION_SPEED_ESTIMATE is supported by MAVROS, but not supported on upstream PX4. So not relevant currently.

Hoping this clarifies it.

@TSC21
Member

TSC21 commented Dec 4, 2018

  • You're basically saying that if you have MoCap data but you want to use EKF2 then you have to map the MOCAP topics to VISION_POSITION_ESTIMATE in ROS right? Are there instructions on how to do that?

Regarding that, that's something you get from ROS docs: http://wiki.ros.org/roslaunch/XML/remap

@hamishwillee
Collaborator

Hoping this clarifies it.

@TSC21 Excellent answer. Thank you.
