Coordinate system conversion #18

Closed
HomieRegina opened this issue Jul 11, 2022 · 15 comments
Comments

@HomieRegina

HomieRegina commented Jul 11, 2022

Hi, thanks for your great work. I used my LiDAR and IMU to record a dataset and ran it through DLO. The figure below shows the map generated by the algorithm. I found that the IMU drift is very serious. Is this related to the coordinate system transformation? How should I set up and configure the files to improve this situation?
[screenshots: maps generated by DLO showing severe drift]

@kennyjchen
Collaborator

Does it work if IMU is disabled by setting imu: false? If so, then your IMU is probably flipped and providing DLO with a wrong scan-matching prior. The current implementation assumes that the LiDAR and IMU coordinate systems coincide, so try rearranging your sensors to align (preferably in ROS forward-left-up) if possible. Otherwise I can add support for extrinsics.
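For reference, the toggle mentioned above is a DLO YAML parameter. A minimal sketch (the `imu` and `gravityAlign` parameter names come from this thread; the surrounding file layout and any namespace are assumptions):

```yaml
# Sketch of the relevant DLO parameters; the exact file and namespace may differ.
imu: false          # disable the IMU scan-matching prior for this test
gravityAlign: true  # gravity-align the initial pose at startup
```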

@HomieRegina
Author

Thank you for your reply. When the IMU is disabled via that setting, the map does not change. I checked the defined LiDAR coordinate system and the defined IMU coordinate system; they coincide. But I found that my LiDAR and IMU still drift seriously. The following two figures show the coordinate systems of the LiDAR and the IMU. Do I need to recalibrate the LiDAR and IMU?

[image: LiDAR coordinate system]

[image: IMU coordinate system]

@kennyjchen
Collaborator

Hmm... which sensors are you using? There was a similar issue a while back which was fixed through a configuration change, so try checking your driver settings. Otherwise if you record a bag for me I can help debug.

@HomieRegina
Author

HomieRegina commented Jul 13, 2022

Well, thank you for your reply; I have solved some of the issues. The dataset with serious drift had recorded all IMU and LiDAR topics. When I recorded only the IMU acceleration data and the LiDAR point cloud topics, the drift was basically eliminated. Could you tell me whether the IMU's motion trajectory, attitude, and other sensor information affect the accuracy of the mapping?

When the LiDAR system moves unsteadily on its carrier, the built map has errors in the horizontal direction, as shown in the following figure. This problem does not occur on a stably moving carrier. Here is the dataset. Do I need to calibrate the random error of the IMU?
[image: map with horizontal errors]
By the way, does the program provide any setting for joint calibration of the LiDAR and IMU coordinate systems?

@kennyjchen
Collaborator

Interesting. It shouldn't; the IMU callback only pulls from the angular_velocity and linear_acceleration fields. Glad to hear you figured it out though.

DLO's world coordinate system depends on the initial position of the LiDAR, so if you start tilted, the entire map will be tilted. I checked out your bag though, and it looks like you disabled gravity alignment. Your data does better if you turn it on. Keep in mind that the gravity vector isn't fully observable (strictly speaking), so it's just an estimation procedure -- but we've seen pretty good results in our experience (with a good IMU).
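The idea behind gravity alignment can be sketched in a few lines: average accelerometer samples while the platform is roughly static, then recover roll and pitch from the direction of the measured gravity vector. This is a generic illustration of the technique, not DLO's exact implementation, and the function name is hypothetical:

```python
import math

def gravity_align(accel_samples):
    """Estimate initial roll and pitch (radians) from accelerometer samples.

    Assumes the IMU is roughly stationary, so the averaged measurement
    points along gravity. Yaw is unobservable from gravity alone, which
    is why the gravity vector doesn't pin down the full orientation.
    """
    n = len(accel_samples)
    ax = sum(a[0] for a in accel_samples) / n
    ay = sum(a[1] for a in accel_samples) / n
    az = sum(a[2] for a in accel_samples) / n
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

# A level forward-left-up sensor measures roughly (0, 0, +9.81):
roll, pitch = gravity_align([(0.0, 0.0, 9.81)] * 100)
# roll and pitch both come out as 0.0
```

Averaging over many samples suppresses accelerometer noise, but any sustained acceleration during the alignment window biases the estimate, which is one reason a good IMU and a static start matter.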

Extrinsic calibration isn't currently supported, but it's not as necessary as in other approaches since we're just using a relative rotational prior for scan-matching.

@HomieRegina
Author

HomieRegina commented Jul 14, 2022

Thank you for your reply and patience. I have solved the above problems.
By the way, how do I generate a .pcd file? I used the official command and got the following output.
[screenshot: command output]

@kennyjchen
Collaborator

Happy to help. You need a space between the two arguments, and the last argument should be the parent folder for the file, i.e.

rosservice call /robot/dlo_map/save_pcd 0.5 ~/Downloads

will save the map as ~/Downloads/dlo_map.pcd.

@HomieRegina
Author

HomieRegina commented Jul 15, 2022

Thanks for your great work and attentive reply. I have successfully got the .pcd file.

Now I have found new problems. Here are some pictures showing the results of a dataset I recorded in an outdoor environment. In Figure 2, you can see from the .pcd file that there is obvious drift in the map. When the motion path returns to the origin, the drift of the map is clearly visible (as shown in Figure 3). Have you ever encountered this situation while building a map? How did you solve it?

By the way, there are obvious pauses (not jitter) when building the map. What is the reason for this?

[Figure 1: outdoor map]
[Figure 2: drift visible in the .pcd map]
[Figure 3: drift at the return to the origin]

@kennyjchen
Collaborator

I don't see any obvious drift in your map itself, but if you are referring to the map's tilt, make sure you start on flat ground as I mentioned previously (i.e., Z axis-aligned to gravity). Your initial position looks like it has some positive roll and maybe some negative pitch. If you're using our gravity alignment procedure, the IMU and LiDAR need to be rotationally aligned on your platform (I can't tell if they are from your second reply).

Regarding the periodic pauses, our map publisher is on a one second ros::Timer, so RViz may slow down / delay as the map grows. This was done in the event of lost communication with our robots for SubT so that the robot could send the full map once it came within range again. I added an option in v1.4.2 to turn this off and only publish keyframes individually as a solution. Make sure to turn up the Decay Time if you want to see the full map.

@HomieRegina
Author

Thank you for your patient reply. I think my earlier description may not have been accurate enough. Please look at my latest reply (the one with three photos), because that dataset is a newly recorded bag. My trajectory starts from the starting point and finally returns to it. In the last picture you can see a height difference between my starting and final positions. This height difference is the problem I want to solve. Do you know the reason for the inaccurate vertical positioning?
And I'm sure I set gravityAlign: true, and the IMU and LiDAR coordinate systems are physically consistent, but there is no rotational alignment. Do I need to rotationally align the IMU and LiDAR? How can I achieve that? Can rotational alignment solve the inaccurate vertical positioning? We look forward to your reply.

@kennyjchen
Collaborator

Try playing around with the voxelization (e.g., turn off submap voxelization), the number of keyframes used in submapping (maybe increase it), and the maximum correspondence distances for S2S and S2M.

@HomieRegina
Author

HomieRegina commented Jul 21, 2022

Hi, thank you for your good work and your reply. I have been adjusting these parameters over the past few days, and I'd like to ask about something I'm not very clear on.

When I changed the parameters of the submap voxel filter, I found that the clarity of the map changed. Looking at the code, I found that this parameter is related to the point cloud keyframes. Is that right? When I turned this parameter down, the map became clearer; when I turned it up, the number of keyframes decreased. So I want to ask: is this parameter a filter? Is its reciprocal used, or some other form?

And when I adjusted the maximum correspondence distance parameter, I couldn't see what changed in the mapping process, but the coordinate error of the map clearly changed a lot. I can't find a pattern for the time being. I want to know what aspects of the mapping process are affected by adjusting the maximum correspondence distance parameters of S2S and S2M in GICP.

Thank you for your answer. We look forward to your reply.

@kennyjchen
Collaborator

A voxel filter downsamples a pointcloud depending on the leaf size of each voxel. Think of this as the size of each "3D pixel": the larger the number, the bigger the voxel, and vice versa. For a leaf size of 0.1 m (for example), there will be at most one point per 0.1 m voxel, i.e., per 0.001 cubic meters of the cloud.

This can affect DLO's adaptive keyframing from our spaciousness metric, which computes the median point distance.
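The voxel-grid idea above can be sketched in a few lines: bucket points by their voxel index, then keep one averaged point per occupied voxel. This is a generic illustration of the technique (a real implementation such as PCL's VoxelGrid is more efficient), with a hypothetical function name:

```python
import math
from collections import defaultdict

def voxel_downsample(points, leaf_size):
    """Keep one averaged (centroid) point per occupied voxel of side leaf_size."""
    buckets = defaultdict(list)
    for x, y, z in points:
        # Integer voxel index along each axis; all points sharing an index
        # fall inside the same leaf_size x leaf_size x leaf_size cube.
        key = (math.floor(x / leaf_size),
               math.floor(y / leaf_size),
               math.floor(z / leaf_size))
        buckets[key].append((x, y, z))
    # Replace each voxel's points with their centroid.
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in buckets.values()]

cloud = [(0.01, 0.02, 0.0), (0.03, 0.01, 0.0), (0.5, 0.5, 0.0)]
print(len(voxel_downsample(cloud, 0.1)))  # 2: the first two points share a voxel
print(len(voxel_downsample(cloud, 1.0)))  # 1: everything collapses into one voxel
```

A smaller leaf size keeps more points (a denser, sharper-looking map); a larger one thins the cloud, which in turn shifts any statistic computed on it, such as a median-distance spaciousness metric.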

The maximum correspondence distance affects scan-matching and which pairs of points are considered during optimization. In larger environments, a larger distance is generally good since points are probably more spread out. In smaller environments (like your dataset), bad correspondence matching can corrupt the optimization process and therefore the overall result. This holds true mainly for S2M; for S2S, the search is purely between two instantaneous scans. In that case, it's slightly dependent on walking speed and the rate of the LiDAR (but in general it's not as sensitive as S2M).
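The role of the maximum correspondence distance can be illustrated with a brute-force nearest-neighbor matcher: pairs farther apart than the threshold are rejected so they cannot pull the alignment toward bad matches. This is a generic sketch, not GICP's internals, and the function name is hypothetical:

```python
import math

def find_correspondences(source, target, max_dist):
    """Pair each source point with its nearest target point, rejecting
    pairs farther apart than max_dist (which would otherwise corrupt
    the alignment objective)."""
    pairs = []
    for i, s in enumerate(source):
        # Brute-force nearest neighbor; real implementations use a k-d tree.
        j, d = min(((k, math.dist(s, t)) for k, t in enumerate(target)),
                   key=lambda kd: kd[1])
        if d <= max_dist:
            pairs.append((i, j))
    return pairs

src = [(0.0, 0.0), (1.0, 0.0), (10.0, 0.0)]
tgt = [(0.1, 0.0), (1.1, 0.0)]
print(find_correspondences(src, tgt, 1.0))    # [(0, 0), (1, 1)] -- outlier rejected
print(find_correspondences(src, tgt, 100.0))  # [(0, 0), (1, 1), (2, 1)] -- bad pair kept
```

With a loose threshold the spurious pair (2, 1) enters the objective, which is exactly how a too-large correspondence distance can degrade registration in small, cluttered environments.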

Here is a nice tutorial on what voxel filtering is, and you can read more here about correspondence-based registration.

@HomieRegina
Author

HomieRegina commented Jul 22, 2022

Thanks for your reply. I will study this part carefully.

I recorded a dataset in a business district and built a map. We took the escalator from the second floor of the mall down to the first floor. After walking around the first floor, I returned to the starting point on the second floor. The mapping result is good.
[images: maps of the shopping-mall dataset]
You can see a slash across the second floor. Don't worry; that part is the escalator from the second floor to the third floor, as shown in the third figure.

@kennyjchen
Collaborator

Nice! Thanks for sharing your results with us :^)

If there are no more questions, I'll be closing this thread. Feel free to reopen or create a new issue if you have any further questions or comments about our work.
