Coordinate transformation when using IMU #19
We are also having trouble with this. It would be good to confirm the input coordinate frames for all the sensors, which transforms are applied in the existing code, and how we can modify the code (or use /tf) to get good results.

We have successfully been running the 2017-06-08-15-52-45_3.bag dataset (available at https://drive.google.com/drive/folders/1hVpHJDgZ2x5l7D5PkwMV1fVaaAd6cJ5f, as provided by the authors), which includes IMU data. Running "rostopic echo /imu/data" shows that gravity is included and is oriented along the z-axis as a POSITIVE value. The bagged data includes all three fields: orientation quaternion, angular velocity, and linear acceleration.

In the linked YouTube video (https://www.youtube.com/watch?v=O3tz_ftHV48) the LIDAR is mounted with the cable towards the back of the Jackal base, for example at this frame: https://youtu.be/O3tz_ftHV48?t=10. When we view the PointCloud2 topic from the LIDAR, we get positive X forwards (towards the Velodyne sticker, away from the cable), positive Y to the left, and positive Z up. Perhaps this means that we need to create a transform that flips the IMU data, and potentially a twist as well. We'll keep you posted.

PS: We also get "[ WARN][timestamp]: MSG TO TF: Quaternion Not Properly Normalized", which surprised us, since the quaternion vector (x, y, z, w) has a Euclidean length of 1. EDIT: When this warning is raised, a normalization function is called, so it should not cause any issues; it appears to be merely a matter of floating-point losses in accuracy.
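The warning in the PS can be reproduced offline: a quaternion that prints as unit-length to a few decimal places can still fail a strict norm check due to floating-point rounding, after which the consumer simply re-normalizes it. A minimal sketch in plain Python (the tolerance and the logged values below are illustrative assumptions, not the actual tf source or the actual bag data):

```python
import math

def needs_normalization(x, y, z, w, tol=1e-6):
    """Return True if the quaternion norm deviates from 1 beyond tol.

    tol is an illustrative tolerance, not the exact value tf uses.
    """
    return abs(math.sqrt(x * x + y * y + z * z + w * w) - 1.0) > tol

def normalize(x, y, z, w):
    """Rescale the quaternion to unit length (what tf does on warning)."""
    n = math.sqrt(x * x + y * y + z * z + w * w)
    return (x / n, y / n, z / n, w / n)

# Hypothetical logged components, each rounded to five decimals;
# they look unit-length but trip a 1e-6 tolerance.
q = (0.18301, 0.18301, 0.68301, 0.68301)
print(needs_normalization(*q))  # -> True
```

After `normalize`, the same check passes, which matches the EDIT above: the warning is cosmetic as long as normalization is applied downstream.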
We have kept working, but have still not solved the issues. First off, we realized that simply creating a static transform is not enough, since LeGO-LOAM does not take it into account. Instead, we found multiple re-maps of coordinate data in the code (e.g. x -> y). Our IMU is mounted right on top of the LIDAR for testing purposes, so pure re-maps can put the data in the right order. As previously mentioned, we know that gravity should be positive on the Z-axis of the /imu/data topic. This really just leaves us with the issue of assigning the X- or Y-axis as the forward direction, as long as we are purely rotating the IMU data into the same axis orientation. A subscriber/publisher node is used to re-map our axes as needed (see http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28c%2B%2B%29 for reference).

We have also figured out that we should be looking at the /tf transform aft_mapped for our current location, in case this was not obvious to other implementers. This seems to work fine when using just the LIDAR, but we still have to run proper tests to verify the performance and distance accuracy.

Our team has reached a break point in the academic semester and will take some time off to study for midterms. We will be fully up and running again on the 30th.
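The pure axis re-map described above can be sketched as a small helper, here in plain Python with the rospy subscriber/publisher plumbing omitted. The specific mapping shown (a 90-degree swap about Z) is only a hypothetical example; the actual mapping depends on how your IMU is mounted relative to the lidar:

```python
AXIS = {'x': 0, 'y': 1, 'z': 2}

def remap_axes(vec, mapping):
    """Remap a 3-vector by a signed axis mapping.

    mapping like ('y', '-x', 'z') means:
        out.x = in.y, out.y = -in.x, out.z = in.z
    Apply the same mapping to linear acceleration and angular velocity.
    """
    out = []
    for spec in mapping:
        sign = -1.0 if spec.startswith('-') else 1.0
        out.append(sign * vec[AXIS[spec.lstrip('-')]])
    return tuple(out)

# Hypothetical mounting: IMU rotated 90 degrees about Z w.r.t. the lidar.
# Gravity measured on the IMU's y-axis lands on the output x-axis.
print(remap_axes((0.0, 9.81, 0.0), ('y', '-x', 'z')))  # -> (9.81, -0.0, 0.0)
```

Note that the orientation quaternion cannot be remapped component-wise like this; it has to be composed with the mounting rotation, which is one reason a pure subscriber/publisher re-map only works cleanly when the IMU-to-lidar rotation is a simple axis permutation.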
Hi @luhcforgh and @author
Thank you all for sharing your knowledge!
Hi @luhcforgh and @goldenminerlmg.
I would also like to get the top view of the TF and the sensor-suite picture from the paper. Could you help me? Any help will be appreciated!
Hi @luhcforgh!
The frame axes in the VLP-16 documentation do not coincide with REP-105, but the velodyne driver outputs the correct ENU orientation.
Hi @zorosmith
Hi @goldenminerlmg, I have some thoughts on your questions.

For the first one, the correction of the point cloud: the exact scanning time of each point in the current scan must be determined first. Then the IMU samples immediately before and after that time can be found, and the sensor pose and velocity at the moment the point was scanned are interpolated between them. With that pose and velocity, the point in the lidar frame can be transformed into the world frame, and then into the lidar frame at the start of the scan. In this way all points in the current scan are expressed in the same frame, namely the lidar frame at the moment the scan started.

For the second question, I agree with your opinion that the extrinsic parameters should be calibrated first, yet both LOAM and LeGO-LOAM ignore them. I'm not sure whether this affects the result when the extrinsics are a pure translation. It still needs further discussion!
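The interpolation step described above can be sketched as follows. This is a simplified, self-contained illustration, not the actual LeGO-LOAM code: it linearly interpolates an IMU-integrated world-frame position between the two samples that bracket the point's scan time (the function name, sample rate, and values are all assumptions for the example):

```python
from bisect import bisect_left

def interpolate_imu(imu_times, imu_positions, t):
    """Linearly interpolate a world-frame position at time t.

    imu_times     -- sorted list of IMU sample timestamps
    imu_positions -- world-frame (x, y, z) positions at those timestamps
    t             -- scan time of the lidar point; must lie within the list
    """
    i = bisect_left(imu_times, t)
    if imu_times[i] == t:           # exact hit on an IMU sample
        return imu_positions[i]
    t0, t1 = imu_times[i - 1], imu_times[i]
    ratio = (t - t0) / (t1 - t0)    # fraction of the way to the later sample
    p0, p1 = imu_positions[i - 1], imu_positions[i]
    return tuple(a + ratio * (b - a) for a, b in zip(p0, p1))

# Hypothetical 100 Hz IMU track; a point scanned at t = 0.015 s falls
# halfway between the samples at 0.01 s and 0.02 s.
times = [0.00, 0.01, 0.02, 0.03]
positions = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0), (0.3, 0.0, 0.0)]
print(interpolate_imu(times, positions, 0.015))  # approx (0.15, 0.0, 0.0)
```

The same interpolation would be applied to velocity and orientation (orientation via slerp rather than component-wise lerp) before transforming the point into the world frame.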
Hello @LiShuaixin,
This step confuses me: why is the point in the lidar frame transformed into the world frame first? Do you mean that the pose and velocity obtained from the IMU are expressed in the world frame? Finally, I recommend a paper for you, @goldenminerlmg, @LiShuaixin.
Hi @zorosmith, you can check the function 'TransformToStartIMU' in featureAssociation.cpp to understand the operation. Since the real lidar pose varies over the period of a complete scan (the lidar is moving), the reference frame of each scanned point is different. The IMU shift accumulation, which is part of the IMU data integration, represents the IMU translation in the world frame. Therefore we need to transform each point into the world frame first, and then into the common lidar frame. But it's odd that the IMU shift accumulation is not used in this function. That's why I said I'm not sure whether the result is affected when the extrinsic parameters are a pure translation (as the author explains in his paper, the lidar and IMU are mounted without relative rotation).
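The detour through the world frame amounts to composing two rigid transforms: current lidar frame to world, then world to the lidar frame at the start of the scan. A self-contained sketch, restricted to yaw-only rotation to keep it short (this is an illustration of the geometry, not the actual 'TransformToStartIMU' code, which uses full roll/pitch/yaw rotations):

```python
import math

def rot_z(yaw):
    """3x3 rotation matrix for a rotation about the Z axis."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(R, p):
    return tuple(sum(R[i][j] * p[j] for j in range(3)) for i in range(3))

def transpose(R):
    return [[R[j][i] for j in range(3)] for i in range(3)]

def to_start_frame(p_cur, yaw_cur, t_cur, yaw_start, t_start):
    """Re-express a point from the current lidar frame in the
    start-of-scan lidar frame, via the world frame:
        p_world = R_cur * p + t_cur
        p_start = R_start^T * (p_world - t_start)
    """
    p_world = tuple(a + b for a, b in zip(apply(rot_z(yaw_cur), p_cur), t_cur))
    shifted = tuple(a - b for a, b in zip(p_world, t_start))
    return apply(transpose(rot_z(yaw_start)), shifted)

# If the lidar has not moved since the scan started, the point is unchanged.
print(to_start_frame((1.0, 0.0, 0.0), 0.3, (2.0, 0.0, 0.0), 0.3, (2.0, 0.0, 0.0)))
```

Note that if the translations `t_cur` and `t_start` are dropped (as happens when the accumulated IMU shift is not used), only the rotational part of the motion is compensated, which is exactly the situation being questioned above.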
Hello @annt3k, sorry for the late reply.
Hello @LiShuaixin, you are right!
@zorosmith you can email me at shuaixinli_md@126.com
Hi author, thanks for your work!
2. I think that before fusing these two kinds of messages, the sensor suite should first be calibrated temporally and spatially. I haven't seen any code doing this in either LOAM or your project. Can you tell me what the influence of the parameters Rc and Pc (the transformation between lidar and IMU) is, or what I should pay attention to when using my own suite?
Thank you very much!