
Coordinate transformation when use imu #19

Closed
goldenminerlmg opened this issue Sep 10, 2018 · 15 comments
Labels
wontfix This will not be worked on

Comments

@goldenminerlmg

Hi author, thanks for your work!

  1. I wonder how the coordinate transformation works in the point-cloud distortion-correction process. Does it first convert the current point into the IMU frame (or the world frame?), then integrate the IMU measurements, then compute the relative pose and velocity (in the world frame) between the current point and the first point, and finally transform those relative measurements back into the lidar frame, linearly interpolate, and register every point on the corresponding scan line? I am not sure I have the order right.
  2. I think that before fusing these two kinds of messages, the sensor suite should first be calibrated temporally and spatially. I haven't seen any code that does this in either LOAM or your project. Can you tell me what influence the parameters Rc and Pc (the transformation between lidar and IMU) have, or what I should pay attention to when using my own suite?

Thank you very much!
@luhcforgh

luhcforgh commented Oct 11, 2018

We are also having trouble with this. It would be good to confirm the input coordinates for all the sensors, what transforms are applied in the existing code, and how we can modify the code (or use /tf to transform it) to get good results.

We have successfully been running the 2017-06-08-15-52-45_3.bag dataset (available at https://drive.google.com/drive/folders/1hVpHJDgZ2x5l7D5PkwMV1fVaaAd6cJ5f, as provided by the authors) which includes IMU data. We ran "rostopic echo /imu/data" and found that gravity is included, and is oriented in the z-axis, as a POSITIVE value. The bagged data includes all three datatypes: quaternion, angular velocity, and linear acceleration.

In the linked YouTube video (https://www.youtube.com/watch?v=O3tz_ftHV48) we can see that the LIDAR is positioned with the cable towards the back on the Jackal base, for example at this frame: https://youtu.be/O3tz_ftHV48?t=10.

When we view the PointCloud2 topic from the LIDAR, we get positive X forwards (positive in the direction of the Velodyne sticker, negative along the cable), Y to the left (positive), and Z up (positive). Perhaps this means that we need to create a transform that flips the IMU data, and potentially a twist as well... We'll keep you posted.


PS: We also get "[ WARN][timestamp]: MSG TO TF: Quaternion Not Properly Normalized", which we find surprising, since the quaternion vector (x, y, z, w) has a Euclidean length of 1... EDIT: When this warning is raised, a normalization function is called, so it should not cause any issues. It seems to be merely a matter of floating-point accuracy loss.

@luhcforgh

We have kept working, but have not solved the issues yet. First off, we realized that simply creating a static transform is not enough, since LeGO-LOAM does not take it into account. Instead, we found multiple re-maps of coordinate data in the code (e.g. x -> y). Our IMU is mounted directly on top of the LIDAR for testing purposes, so pure re-maps can put the data in the right order.

As previously mentioned, we know that gravity should be positive on the Z-axis of the /imu/data topic. This really just leaves us with the question of assigning the X- or Y-axis as the forward direction, as long as we are purely rotating the IMU data to fit the same axis orientation. A subscriber/publisher node is used to easily re-map our axes as needed (see http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28c%2B%2B%29 for reference).

We have also "figured out" that we should be looking at the /tf transform aft_mapped for our current location - in case this was not obvious to any other implementers. This seems to work fine when using just the LIDAR, but we have to run some proper tests to verify the performance and distance accuracy.

Our team has reached a break point in the academic semester and will take some time off to study for midterms. We will be fully up and running again on the 30th.

@goldenminerlmg
Author

Hi @luhcforgh and @author,
Did you test the algorithm on the KITTI dataset? I am not sure whether its IMU data is suitable for LeGO-LOAM, because the IMU in KITTI runs at only 10 Hz, and that frequency may be too low.
By the way, how is your research going? I would like to discuss with you how the IMU can make a difference.

[quoting @luhcforgh's comment above in full]

@zorosmith

Thank you all for sharing your knowledge!
I'm also trying to run LeGO-LOAM with an IMU.
I'll share my experience if I have any breakthroughs.

@zorosmith

Hi @luhcforgh and @goldenminerlmg,
I have learned something about the coordinate frames of the IMU and lidar.
I played the rosbag provided by the author and watched the TF tree, as you can see in the following picture.
[image: velodyne-imu TF tree]
From the picture I conclude that the lidar and IMU frames are exactly aligned: x forward, y to the left, and z up.
At the same time, I echoed the /tf between velodyne and imu_link:
rosrun tf tf_echo velodyne imu_link
and got output like this:

  • Translation: [-0.070, -0.030, -0.364]
  • Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
    in RPY (radian) [0.000, -0.000, 0.000]
    in RPY (degree) [0.000, -0.000, 0.000]

This verifies my conclusion.

I also took a top view of the TF and the sensor-suite picture from the paper.
[image: lidar-near-front]
[image: jackal-label]
From the first picture, I can see that the x-axis of the lidar points toward the front wheel.
From the second picture, I can see that the lidar's logo faces the blue axis.
However, the manual gives the following coordinate frame for my Velodyne VLP-16.
[image: velodyne VLP-16 frame from the manual]
This makes me somewhat confused. I have two guesses:
a. Does the author re-map the lidar frame in the code (x -> -y', y -> x', z -> z', where xyz is my lidar's frame)?
b. Is my lidar's coordinate frame different from yours?

Could you help me? Any help will be appreciated!
Thank you!

@zorosmith

Hi @luhcforgh!
I also found that there are many re-maps in featureAssociation.cpp, in functions like these:
void imuHandler(const sensor_msgs::Imu::ConstPtr& imuIn)
void adjustOutlierCloud()
void adjustDistortion()
void updateInitialGuess()
I have failed to figure out the reason for these operations.
Do you have any idea?
Any help will be appreciated! Thank you!

@mbudris

mbudris commented Apr 4, 2019

The frame axes in the VLP-16 documentation do not coincide with REP-105, but the velodyne driver outputs the correct ENU orientation.
Refer to:
ros-drivers/velodyne#71

@annt3k

annt3k commented Apr 9, 2019

[quoting @zorosmith's comment above in full]

Hi @zorosmith,
I think the reason for re-mapping y -> x, z -> y, and x -> z is that both LeGO-LOAM and LOAM are based on the loam_continuous package, whose source code you can find at
https://github.com/daobilige-su/loam_continuous
In loam_continuous, the lidar is a 2D lidar mounted vertically and rotated.
To use a horizontally mounted Velodyne lidar, I think the author re-mapped y -> x, z -> y, and x -> z in both the lidar and IMU data so the legacy loam_continuous code could be reused.
I may be wrong, though; what do you think?

@LiShuaixin

Hi @goldenminerlmg, I have some thoughts on your questions.

For the first one, the correction of the point cloud: the exact scanning time of each point in the current scan should be determined first. Then the IMU data just before and after that time can be found. After that, the sensor pose and velocity at the moment the point was scanned are interpolated. With that pose and velocity, the point can be transformed from the lidar frame into the world frame, and then into the lidar frame at the start of the scan. In this way, all points in the current scan are transformed into the same frame, namely the lidar frame at the moment the scan started.

For the second question, I agree with your opinion that the extrinsic parameters should be calibrated first, while both LOAM and LeGO-LOAM ignore them. But I'm not sure whether it affects the result when the extrinsics are a pure translation. It still needs to be discussed further!

@zorosmith

Hello @LiShuaixin,

With that pose and velocity, the point can be transformed from the lidar frame into the world frame, and then into the lidar frame at the start of the scan.

This step confuses me. Why is the point in the lidar frame transformed into the world frame first? Do you mean that the pose and velocity obtained from the IMU are expressed in the world frame?

Finally, I recommend a paper to you, @goldenminerlmg and @LiShuaixin:
ICRA 2018, "3D Lidar-IMU Calibration Based on Upsampled Preintegrated Measurements for Motion Distortion Correction".
I think it will help; however, I am not able to make sense of it yet.

@LiShuaixin

Hi @zorosmith, you can check the function TransformToStartIMU in featureAssociation.cpp to understand the operation. Since the actual lidar pose varies over the duration of a complete scan (the lidar is moving), the reference frame of each scanned point is different. The accumulated IMU shift, which is part of the IMU data integration, represents the IMU translation in the world frame. Therefore, we need to transform each point into the world frame first and then into the common lidar frame.

But it's weird that the accumulated IMU shift is not used in this function. That's why I said I'm not sure whether it affects the result when the extrinsics are a pure translation (according to the author's description in the paper, the lidar and IMU are mounted without relative rotation).

@zorosmith

[quoting @zorosmith's question and @annt3k's reply above in full]

Hello @annt3k, sorry for the late reply.
I have never read the loam_continuous code, so I cannot judge whether your answer is right or not.
Anyway, thank you for your reply!
I will spend some time checking what you said these days.

@zorosmith

Hello @LiShuaixin, you are right!
With your guidance, I finally made sense of the TransformToStartIMU function!
It seems that the author only corrects the rotational distortion.
Maybe the effect of the translational distortion is limited...
Could you leave your email for me? I hope to have more discussions with you!
Thank you!

@LiShuaixin

@zorosmith you can email me at shuaixinli_md@126.com

@stale

stale bot commented Aug 30, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the wontfix This will not be worked on label Aug 30, 2019
@stale stale bot closed this as completed Sep 6, 2019