
Recording a bag file for cartographer #726

Closed
elgarbe opened this issue Feb 16, 2018 · 7 comments


elgarbe commented Feb 16, 2018

So, I need a bag file with good laser scan data (I have a Hokuyo lidar with urg_node) and optionally IMU data (I have /mavros/imu/data and /mavros/imu/data_raw from the Pixhawk). I also have /mavros/odometry/odom from the Pixhawk.
Do I need to define the TF tree, too?
Is recording a bag file with those topics all I need to run cartographer on my PC?
I would like to go to the river, record a bag file, and later run cartographer on my PC.

Thanks


guilhermelawless commented Feb 26, 2018

I'm going to summarize what I went through to initially make this work for me. It may not be a 100% correct procedure, but it's working for me.

You need to have a TF tree connecting map_frame and tracking_frame (your laser frame, sometimes /odom if you don't have any other sensor; see the comments below). Cartographer publishes a transform between map_frame and published_frame. In our case, we have odometry, so the tracking_frame is different from the published_frame.
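For reference, these frame names correspond to options in the cartographer_ros Lua configuration. A fragment might look like this (the option names are the standard cartographer_ros ones; the values are just an example for a robot with external odometry, and a real config has many more required fields):

```lua
-- Fragment of a cartographer_ros config; example values only.
options = {
  map_frame = "map",             -- fixed frame that cartographer publishes
  tracking_frame = "base_link",  -- frame the SLAM pose is tracked in
  published_frame = "odom",      -- child of map_frame in the published transform
  odom_frame = "odom",
  provide_odom_frame = false,    -- external odometry already provides it
  use_odometry = true,
  -- ...remaining required options omitted here...
}
```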

To start, you can try to make a map using gmapping. If you can do that, then just record the odometry, IMU, laser, /tf and /tf_static topics to your bag (locally on the robot, not on your PC).
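For instance, with the topics mentioned at the top of this thread (and assuming urg_node publishes on /scan), the recording command on the robot could look like this; substitute your own topic names:

```bash
# Record everything cartographer will need, locally on the robot.
rosbag record -O mapping_run.bag \
    /tf /tf_static \
    /scan \
    /mavros/imu/data \
    /mavros/odometry/odom
```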

If you recorded while making a map with gmapping for some visualization, then you need to remove the /map to "some_frame" TF from the bag, or there will be conflicts when you run cartographer. You can do that using this script: https://github.com/srv/srv_tools/blob/kinetic/bag_tools/scripts/remove_tf.py (cartographer has something similar for online use, tf_remove_frames.py).
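The core of what such a script does can be sketched with the rosbag Python API (this is a simplified sketch, not the linked script itself; it requires a ROS environment, and the bag filenames and the "map" frame name are placeholders):

```python
# Sketch: drop all TF transforms whose parent frame is "map" from a bag.
import rosbag

with rosbag.Bag('input.bag') as inbag, rosbag.Bag('output.bag', 'w') as outbag:
    for topic, msg, t in inbag.read_messages():
        if topic in ('/tf', '/tf_static'):
            # Keep only transforms not parented to the map frame.
            msg.transforms = [tr for tr in msg.transforms
                              if tr.header.frame_id != 'map']
            if not msg.transforms:
                continue  # drop the message entirely if nothing is left
        outbag.write(topic, msg, t)
```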

Now with your bag, just play around with the config file, and also run cartographer_rosbag_validate to make sure your data is good - maybe it's not so good and you will be better off not using the IMU, the odometry, or both.

Finally, if you will run cartographer offline (bag batch processing), do not forget to set -keep_running true in the node arguments in your launch file.
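A launch file entry for offline processing might look like the sketch below (the configuration directory, basename and bag argument are placeholders for your own setup):

```xml
<!-- Sketch: offline bag processing with cartographer_ros.
     -keep_running true keeps the node alive after the bag ends. -->
<node name="cartographer_offline_node" pkg="cartographer_ros"
      type="cartographer_offline_node" output="screen"
      args="-configuration_directory $(find my_robot)/config
            -configuration_basenames my_robot.lua
            -keep_running true
            -bag_filenames $(arg bag_filenames)"/>
```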


ojura commented Feb 26, 2018

tracking_frame (your laser frame, sometimes /odom if you don't have any other sensor)

Tracking frame is not odom, ever! If using an IMU, it has to be collocated with the IMU frame. In other cases, it is usually set to base_link.

@guilhermelawless
Contributor

@ojura Right; edited my comment


ojura commented Feb 26, 2018

One more detail:

You need to have a TF tree connecting map_frame and tracking_frame

This is done by cartographer, not the user. Cartographer publishes the map to tracking transform. What you need to have is a complete tree from the tracking frame to sensor frames.


MichaelGrupp commented Mar 1, 2018

  1. Proper extrinsic calibration - if you don't know the transformations between your sensors, all later steps are useless.
  2. Provide the extrinsic calibration as a static TF tree (e.g. using a robot_state_publisher).
     Example: if you have an IMU and 2 laser scanners, then you need to have a tree that connects the tracking frame with the frame_ids of the sensors, like this for example:

     base_link
     |_ imu_frame
        |_ laser_1_frame
        |_ laser_2_frame

     Here, base_link is the tracking frame. As @ojura mentioned, the tracking frame needs to be collocated with the IMU if you use one. Therefore we place the IMU at the root of the extrinsic calibration tree. The transformation between the tracking frame base_link and imu_frame must be an identity, i.e. the two frames are collocated.
  3. Record all sensor topics, /tf and /tf_static.
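Step 2 above can be sketched as a minimal URDF consumed by robot_state_publisher. The link names match the example tree, and the laser offsets are placeholder values you would replace with your measured extrinsic calibration:

```xml
<robot name="example_robot">
  <link name="base_link"/>
  <link name="imu_frame"/>
  <link name="laser_1_frame"/>
  <link name="laser_2_frame"/>

  <!-- Identity transform: tracking frame and IMU frame are collocated. -->
  <joint name="base_to_imu" type="fixed">
    <parent link="base_link"/>
    <child link="imu_frame"/>
    <origin xyz="0 0 0" rpy="0 0 0"/>
  </joint>

  <!-- Placeholder extrinsics: replace with your measured calibration. -->
  <joint name="imu_to_laser_1" type="fixed">
    <parent link="imu_frame"/>
    <child link="laser_1_frame"/>
    <origin xyz="0.10 0 0.05" rpy="0 0 0"/>
  </joint>
  <joint name="imu_to_laser_2" type="fixed">
    <parent link="imu_frame"/>
    <child link="laser_2_frame"/>
    <origin xyz="-0.10 0 0.05" rpy="0 0 3.14159"/>
  </joint>
</robot>
```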

I would like to go to the river and record a bag file [...]

😄 I don't exactly know what your plans are, but it sounds like you're using a drone? Just a single 2D Hokuyo will very likely give you bad results. At the very least, the IMU shouldn't be optional, but even that won't give you altitude.


elgarbe commented Mar 1, 2018

Thanks to all of you for helping me!
I'm reading and trying to implement what you say; I will get back to you in a couple of days with results or more questions.
BTW, I'm using an ASV (autonomous surface vessel) and would like to get a 2D map of the coast. I already achieved this with hector_mapping, but I would like to improve the mapping by adding some sensors. I have a Pixhawk as flight controller, so I have IMU data (through mavros) and some odometry messages (I have to investigate what data ardurover publishes in the odom message). The thing is, I'm new to ROS and therefore to cartographer. I'm reading how to create my URDF file with my sensors and their links right now.
I have a Pixhawk at the ASV frame origin. Then I have a laser 0.5 m to the right of the Pixhawk, facing along the Y axis, looking to the right of the vehicle. And that's all.
My companion computer is a Raspberry Pi 3 running Ubuntu MATE and ROS Kinetic. I know that I can't run cartographer on the Pi, so I will need to record a bag file and process it on my desktop PC.


ojura commented Mar 1, 2018

I would just like to add to @MichaelGrupp's comment that the imu-to-tracking transform doesn't have to be an identity - a pure rotation is also OK (the two frames are still collocated).
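The point about rotation-only transforms can be illustrated without any ROS code: a rigid transform with zero translation maps the frame origin onto itself no matter what the rotation is, so the two frames remain collocated. A small plain-Python sketch (quaternion convention (w, x, y, z); all names here are illustrative, not a ROS API):

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = v + 2*r x (r x v + w*v), with r = (x, y, z)
    cx = y * vz - z * vy + w * vx
    cy = z * vx - x * vz + w * vy
    cz = x * vy - y * vx + w * vz
    return (
        vx + 2 * (y * cz - z * cy),
        vy + 2 * (z * cx - x * cz),
        vz + 2 * (x * cy - y * cx),
    )

def transform_point(translation, rotation, p):
    """Apply a rigid transform (rotation, then translation) to point p."""
    rp = quat_rotate(rotation, p)
    return tuple(t + c for t, c in zip(translation, rp))

# 90-degree rotation about z, zero translation: frames stay collocated.
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
origin = transform_point((0.0, 0.0, 0.0), q, (0.0, 0.0, 0.0))
print(origin)  # (0.0, 0.0, 0.0): the origins coincide despite the rotation
```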

@cschuet cschuet closed this as completed Mar 10, 2018
doronhi pushed a commit to doronhi/cartographer_ros that referenced this issue Nov 27, 2018
…r-project#729)

I noticed that @jihoonl opened the PR cartographer-project#726 which performs a similar thing. As discussed in cartographer-project#613 (@cschuet  has already taken a look), I pulled this out of cartographer-project#481 (a really old PR whose merging has been postponed), which is an example where re-running optimization is triggered from elsewhere as well (besides from `ComputeConstraintsForNode`). This refactoring makes libcartographer friendlier for use cases such as that one.

An important detail is that I have changed the condition in `WaitForAllComputations` to also check if the work queue is empty. If there are other things on the worker queue besides `ComputeConstraintsForNode`, currently we will wrongfully conclude that all computations are done. (This detail was merged in cartographer-project#754, so it's no longer in the diff of this PR).

Also missing is the same thing for 3D. I can add that when we settle on this.

Also, I suggest that `run_loop_closure` gets renamed to `run_optimization`.