Add instructions for mocap fusion using EKF2 #573
Conversation
* Align your robot's forward direction with the system +x-axis
* Define a rigid body in the Motive software. Give the robot a name that does not contain spaces, e.g. `robot1` instead of `Rigidbody 1`
* Enable Frame Broadcast and VRPN streaming
* Set the Up axis to be the Z axis (the default is Y)
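To get the rigid-body pose into ROS, the `vrpn_client_ros` package can be pointed at the Motive machine. A minimal launch-file sketch, based on the package's `sample.launch` parameters; the server IP is a placeholder for your network:

```xml
<!-- Hedged sketch: connect vrpn_client_ros to the OptiTrack/Motive VRPN server. -->
<!-- The server IP (192.168.1.100) is a placeholder for your setup. -->
<launch>
  <node pkg="vrpn_client_ros" type="vrpn_client_node" name="vrpn_client_node" output="screen">
    <rosparam subst_value="true">
      server: 192.168.1.100
      port: 3883
      update_frequency: 100.0
      frame_id: world
      use_server_time: false
      broadcast_tf: true
      refresh_tracker_frequency: 1.0
    </rosparam>
  </node>
</launch>
```

With this running, the pose of a rigid body named `robot1` should appear on `/vrpn_client_node/robot1/pose`.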
Is it obvious how to do this in the MOCAP software?
I will try to get some demonstrative pictures.
I think it still makes sense to add some pictures (besides the videos).
@mzahana We merged this, but do you still think you might be able to get some images?
### OptiTrack MOCAP

The following steps explain how to feed position estimates from an OptiTrack system to PX4. It is assumed that the MOCAP system is calibrated.
- Can you tell me a good URL for getting an OptiTrack system / the system you used?
Yes I will check it today.

@hamishwillee I added some links to the OptiTrack setup steps
* External position estimate can be enabled by setting `EKF2_AID_MASK` to enable position and yaw fusion
* To use the external height estimate for altitude estimation, set `EKF2_HGT_MODE` to use vision
* Adjust the `EKF2_EV_DELAY` parameter according to how fast you receive the external position data with respect to the flight controller's IMU. Reduce the value of this parameter if you are getting external data at high rate, e.g. `50ms`
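These parameters can be set from the QGroundControl parameter editor or the MAVLink shell. A sketch of values commonly used for this configuration; the bit-mask and enum values assume PX4's documented EKF2 parameter definitions, so verify them against your firmware version:

```sh
# Hedged sketch: parameter values assume PX4's documented EKF2 bit masks.
param set EKF2_AID_MASK 24    # bit 3 (8) vision position fusion + bit 4 (16) vision yaw fusion
param set EKF2_HGT_MODE 3     # 3 = use vision as the height reference
param set EKF2_EV_DELAY 50    # ms; tune empirically for your setup
param save
```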
This is not entirely correct. This value actually represents how far the timestamp of the measurement is off from the "actual" time it was captured at. It can technically be set to 0 if there is correct timestamping (not just arrival time) and timesync (e.g. NTP) between the mocap and ROS computers. In reality, this needs some empirical tuning, since delays in the entire Mocap->PX4 chain are very setup-specific and there is rarely a well set up system with an entirely synchronised chain.
Empirical tuning involves looking at the EKF innovations during dynamic maneuvers, and doing a parameter search for this value which yields the lowest innovations.
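As a rough illustration of that empirical search, the sketch below slides a delayed external-vision trace against the IMU-derived trace and picks the shift with the smallest error. In practice you would compare EKF innovations from flight logs rather than this toy signal; the 50 ms delay and 10 ms sample period here are just example values:

```python
import numpy as np

def estimate_delay_ms(imu_signal, ev_signal, dt_ms, max_delay_ms=200):
    """Return the shift (ms) that best aligns the delayed EV trace with the IMU trace."""
    best_delay, best_err = 0, float("inf")
    for delay in range(0, max_delay_ms + 1, dt_ms):
        k = delay // dt_ms
        n = len(ev_signal) - k
        if n <= 0:
            break
        # Shift the EV trace forward by k samples and compare against the IMU trace.
        err = float(np.mean((ev_signal[k:k + n] - imu_signal[:n]) ** 2))
        if err < best_err:
            best_delay, best_err = delay, err
    return best_delay

# Toy data: the EV pose is the IMU-derived pose delayed by 50 ms, sampled every 10 ms.
t = np.arange(0.0, 5.0, 0.01)
imu = np.sin(2 * np.pi * 0.5 * t)
ev = np.sin(2 * np.pi * 0.5 * (t - 0.05))
delay = estimate_delay_ms(imu, ev, dt_ms=10)  # → 50
```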
Probably I interpreted this incorrectly, but according to the docs here, `EKF2_EV_DELAY` is the Vision Position Estimator delay relative to IMU measurements.
@mhkabir is right. It is the delay/offset and has nothing to do with the rate.
You could also check for that delay in the logs by looking at the IMU rates and the EV rates and checking the offset.
@ChristophTobler What I understood from @mhkabir is that it is not related to the IMU! Instead it is related to the correct stamping of the mocap data. How is it related to the IMU then? I guess it is related to the fusion using the EKF, but now I'm not sure whether `EKF2_EV_DELAY` is the time offset between the IMU and vision stamping, or something else!
It is related to the IMU as the IMU time stamping is the "base clock" for the EKF.

> I guess it is related to the fusion using the EKF, but now I'm not sure whether `EKF2_EV_DELAY` is the time offset between the IMU and vision stamping, or something else!

It's exactly that.
It's the offset relative to the "base clock", as Christoph describes.
@mzahana So can you fix up according to @mhkabir comment above : #573 (comment)
@ChristophTobler Perhaps a dumb question, but how do you get this log graph? Specifically, the legend at the bottom for the red and green lines has the same text, so looks like you're comparing like with like (even though I know one must be the IMU rate and the other must be the external vision rate)
MAVROS provides a plugin to relay pose data published on `/mavros/vision_pose/pose` to PX4. Assuming that MAVROS is running, you just need to relay the pose topic that you get from MOCAP, `/vrpn_client_node/<rigid_body_name>/pose`, directly to `/mavros/vision_pose/pose`. This can easily be done using the `topic_tools` package as follows:

```bash
rosrun topic_tools relay /vrpn_client_node/<rigid_body_name>/pose /mavros/vision_pose/pose
```
Unnecessary. Just remap the topic instead of relaying, which adds latency.
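For reference, a remap along those lines can go in the launch file that starts the VRPN client, so the MOCAP pose is published directly on the topic MAVROS listens to. A sketch; the node and rigid-body names here are assumptions carried over from the setup steps above:

```xml
<!-- Hedged sketch: publish the MOCAP pose directly on the MAVROS vision topic. -->
<!-- Replace robot1 with your rigid-body name. -->
<launch>
  <node pkg="vrpn_client_ros" type="vrpn_client_node" name="vrpn_client_node" output="screen">
    <remap from="/vrpn_client_node/robot1/pose" to="/mavros/vision_pose/pose"/>
  </node>
</launch>
```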
@mhkabir @mzahana Can you guys tell me what this is waiting on for merging? I see a comment here: #573 (comment) - anything else?
@@ -10,9 +10,20 @@ The system can then be used for applications such as position hold indoors or wa

For vision, the MAVLink message used to send the pose data is [VISION_POSITION_ESTIMATE](https://mavlink.io/en/messages/common.html#VISION_POSITION_ESTIMATE) and the message for all motion capture systems is [ATT_POS_MOCAP](https://mavlink.io/en/messages/common.html#ATT_POS_MOCAP).

The mavros ROS-MAVLink interface has default implementations to send these messages. They can also be sent using pure C/C++ code and direct use of the MAVLink library. The ROS topics are: `mocap_pose_estimate` for mocap systems and `vision_pose_estimate` for vision. Check [mavros_extras](http://wiki.ros.org/mavros_extras) for further info.
The mavros ROS-MAVLink interface has default implementations to send these messages. They can also be sent using pure C/C++ code and direct use of the MAVLink library. The ROS topics are: `/mavros/mocap/pose` for mocap systems and `/mavros/vision_pose/pose` for vision. Check [mavros_extras](http://wiki.ros.org/mavros_extras) for further info.
I think the `/mavros/mocap/pose` -> `ATT_POS_MOCAP` -> `vehicle_mocap_odometry` pipeline doesn't work for EKF2. EKF2 doesn't subscribe to the `vehicle_mocap_odometry` uORB topic, only `vehicle_visual_odometry`. And I don't think there is a path from `ATT_POS_MOCAP` to `vehicle_visual_odometry`..., correct?
Correct. You can also use the `ODOMETRY` MAVLink message to propagate data in `vehicle_mocap_odometry`, but that won't be subscribed in EKF2. The only way, for now, is to actually use `VISION_POSITION_ESTIMATE` or `ODOMETRY` with the proper `frame_id` set. You can always remap mocap topics to the respective vision topics in MAVROS if you want. The only estimators using mocap data are LPE+AEQ.

@TSC21 Can this be merged? If not, what is still required?
@hamishwillee sorry for the late reply, but I am busy these days. I made some changes according to the comments. Please check. Thanks.
It needs to be stated that the `ODOMETRY` MAVLink message can also be used to feed mocap data into the FCU. This can be done using the `odom` plugin of MAVROS.

Thanks @mzahana . There are a couple of new comments. The most important one is c616fd9#r228903058
@TSC21 Can you say how to set the proper `frame_id`? There isn't any documentation on the `odom` plugin. (...is it a good idea to also subscribe to the `vehicle_mocap_odometry` topic?)
@mzahana Can you please let me know the ETA for this to be ready to merge? There were just a couple of minor points outstanding, so it would be great to get this in.
@TSC21 This has stalled, and there is an open question for you here: #573 (comment) Do you think you can help push it to conclusion - i.e. merge and do the final updates? After this is in, I would like to split and reconstruct it again and merge/destroy this additional topic: https://dev.px4.io/en/tutorials/motion-capture-vicon-optitrack.html . The ROS topic might then just link to the common information in a page under Computer Vision, i.e. it is ROS configuration rather than the whole shebang. Reasonable?
@hamishwillee I apologize for not being responsive in real-time as I don't usually get time to follow up these days. I did some changes as requested. Please check and let me know if there are further ones. @TSC21 I am not currently aware of the `odom` plugin. Thanks.
> For EKF2, the only ...
It's not implemented because there wasn't any need of doing it, as people can just use the vision topics. There's no break of compatibility, you just need to use the proper topics and mapping if required. The Firmware code is not written to fit the ROS side, it's the other way around.

Makes total sense @hamishwillee.
OK I think this can be added further on. |
In preparation for my next steps ... @TSC21 Here are some statements. Can you please confirm that they are true. Below are my guesses at the paths. What are the ? values?
Not exactly. Both VIO and MoCap are ways of getting visual feedback of the vehicle's pose/odometry. The main difference has to do with the frame perspective: usually Visual Odometry (or Visual Inertial Odometry) gets the data from the vehicle's perspective, since the sensors are onboard the vehicle - this is called egomotion. On the other hand, MoCap systems use a system of cameras to determine the pose of a certain entity in a 3D space, so the sensors are not on the vehicle; it's rather an external system that tells the vehicle "you are here, with this orientation".

Though, we also have the

The differentiation between the two is done on the PX4 side. For EKF2, though, only the "vision" pipeline is allowed, meaning:

What this means is that if you want to feed MoCap data to PX4 to be used by EKF2, you have two ways: remap the mocap topic to the respective vision topic in MAVROS, or send the data through `VISION_POSITION_ESTIMATE` / `ODOMETRY` with the proper `frame_id` set.

Hoping this clarifies it.
Regarding that, that's something you get from the ROS docs: http://wiki.ros.org/roslaunch/XML/remap

@TSC21 Excellent answer. Thank you.

This PR updates the instructions on setting up EKF2 to fuse MOCAP data using MAVROS.