Transformation matrix #16
Comments
You can find the details about the relations among the sensors in our paper, which gives the detailed definitions of these sensor frames.
Yes @ziv-lin, I read the details regarding the hardware setup in your paper, but I am not able to find which transformation matrix is being used: camera-to-LiDAR or LiDAR-to-camera?
Hello, I read your paper; every transformation is w.r.t. the IMU frame. I have a calibration matrix between the LiDAR and the RGB camera, but not with the IMU. Are you using some static matrix between the LiDAR and the IMU? There is no such information in the config file. I am using a Livox Avia with your custom driver. If you are using such a matrix, I can use it to get the transformation between the RGB camera and the IMU. Thank you
Hello @ziv-lin, I think the calibration is good enough because I am using the same matrix in FAST-LIO2 (modified) for coloring the point cloud and it does a decent job. But I think there is something wrong with the way I am putting the matrix there. Is there any repo or tool you can suggest for IMU-RGB camera calibration?
For calibrating the camera intrinsics, you can use the tools provided by OpenCV or MATLAB. To perform the LiDAR-camera calibration, I recommend this repo: https://github.com/hku-mars/livox_camera_calib
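For reference, here is a minimal sketch of the OpenCV intrinsic calibration suggested above. It is not part of R3LIVE; the board size, square size, and image file names are placeholders you would replace with your own setup.

```cpp
// Sketch: estimate camera intrinsics from chessboard images with OpenCV.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    const cv::Size board_size(9, 6);    // inner corners of the chessboard (assumption)
    const double square_size = 0.025;   // square edge length in meters (assumption)

    // 3D corner coordinates on the board plane (z = 0), reused for every image.
    std::vector<cv::Point3f> board_corners;
    for (int r = 0; r < board_size.height; r++)
        for (int c = 0; c < board_size.width; c++)
            board_corners.emplace_back(c * square_size, r * square_size, 0.0);

    std::vector<std::string> image_files = {"calib_0.png", "calib_1.png"}; // placeholder list
    std::vector<std::vector<cv::Point3f>> object_points;
    std::vector<std::vector<cv::Point2f>> image_points;
    cv::Size image_size;

    for (const auto &file : image_files)
    {
        cv::Mat img = cv::imread(file, cv::IMREAD_GRAYSCALE);
        if (img.empty()) continue;
        image_size = img.size();
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners(img, board_size, corners))
        {
            cv::cornerSubPix(img, corners, cv::Size(11, 11), cv::Size(-1, -1),
                             cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 1e-3));
            image_points.push_back(corners);
            object_points.push_back(board_corners);
        }
    }
    if (image_points.empty()) { std::cerr << "no usable images\n"; return 1; }

    cv::Mat K, dist;                    // 3x3 intrinsic matrix and distortion coefficients
    std::vector<cv::Mat> rvecs, tvecs;
    double rms = cv::calibrateCamera(object_points, image_points, image_size,
                                     K, dist, rvecs, tvecs);
    std::cout << "RMS reprojection error: " << rms << "\nK = " << K
              << "\ndist = " << dist << std::endl;
    return 0;
}
```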
@ziv-lin I am using that same repo, but I think it calibrates LiDAR and RGB, not IMU and RGB. Can I put the matrix from this repo directly into the config file?
How about the color of the point cloud in a static environment (i.e., without moving the sensors)? Is it correct there?
@ziv-lin I tried it in a static environment and it looks better, but there is still some offset. The following is the output: the objects appear painted onto the wall, while in reality they are on a table. It looks like there is some static offset.
In our implementation, we treat the LiDAR frame and the IMU frame (the LiDAR's built-in IMU) as the same. Therefore, the extrinsic between the camera and the IMU is the same as the camera-LiDAR extrinsic.
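To make that convention concrete, here is a small illustrative sketch (Eigen-based, not R3LIVE source) of composing the camera-IMU extrinsic from the camera-LiDAR extrinsic when the LiDAR-IMU extrinsic is taken as identity; the numeric values are placeholders.

```cpp
// Sketch: with LiDAR and IMU frames treated as identical, the camera-IMU
// extrinsic equals the camera-LiDAR extrinsic from livox_camera_calib.
#include <Eigen/Dense>
#include <iostream>

int main()
{
    // Camera-LiDAR extrinsic from the calibration tool (placeholder values).
    Eigen::Matrix3d R_cam_lidar = Eigen::Matrix3d::Identity();
    Eigen::Vector3d t_cam_lidar(0.05, 0.0, -0.02);

    // LiDAR-IMU extrinsic, taken as identity per the convention above.
    Eigen::Matrix3d R_lidar_imu = Eigen::Matrix3d::Identity();
    Eigen::Vector3d t_lidar_imu = Eigen::Vector3d::Zero();

    // Compose T_cam_imu = T_cam_lidar * T_lidar_imu.
    Eigen::Matrix3d R_cam_imu = R_cam_lidar * R_lidar_imu;
    Eigen::Vector3d t_cam_imu = R_cam_lidar * t_lidar_imu + t_cam_lidar;

    std::cout << "R_cam_imu:\n" << R_cam_imu
              << "\nt_cam_imu: " << t_cam_imu.transpose() << std::endl;
    return 0;
}
```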
Is this the output of R3LIVE or of something else?
It's from R3LIVE.
Okay, great. Then I can use the matrix directly. I am attaching an image of the FAST-LIO coloring just to make sure the calibration matrix is correct.
@ziv-lin Sorry to bother you, but there is one more issue. Sometimes I am getting the following error (same config file and same bag file):
It seems that your configuration is correct. What is the problem with R3LIVE?
The color is not correct. There is an offset; in the above sample it looks as if the objects are pasted onto a wall, but that is not the case. Such coloring appears only if the scene is static; if I am moving the setup, the color is completely messed up.
Oh, sorry, my mistake... What is your image resolution?
It's 1920 x 1080.
I might see the problem you are hitting: the default image resolution I used is 1280 x 1024, and I should expose this as a configuration for you to set.
Okay, thanks a lot @ziv-lin.
Can you replace all 1280 in the code with 1920 to see if that works? I will commit a hotfix tonight or tomorrow.
Okay, sure, I will try it and update you.
r3live/r3live/src/r3live_vio.cpp Line 381 in a5a4d84
r3live/r3live/src/r3live_vio.cpp Line 387 in a5a4d84
r3live/r3live/src/r3live_vio.cpp Line 391 in a5a4d84
r3live/r3live/src/r3live_vio.cpp Line 397 in a5a4d84
r3live/r3live/src/r3live_vio.cpp Line 1069 in a5a4d84
Thank you for reporting this issue~, it is actually a bug.
Hey @ziv-lin, I updated the image resolution. Thanks a lot for the quick fix. It improved the coloring, but there is still significant bleeding. It looks like a calibration issue, but I am not sure: is it related to the image size or the aspect ratio?
Can you try moving the sensors and check the mapping results? Our algorithm can calibrate the extrinsic online to make it more accurate.
Sorry, my bad, you should also change the image height (image_heigh) from 1024 to 1080, e.g., in the following code: r3live/r3live/src/r3live_vio.cpp Line 387 in a5a4d84
r3live/r3live/src/r3live_vio.cpp Line 397 in a5a4d84
r3live/r3live/src/r3live_vio.cpp Line 1069 in a5a4d84
@ziv-lin I changed the height and width as you suggested, but the issue is still the same: the image color bleeds near the edges. And one more thing, I am still getting the following error sometimes. Whenever I get this error, I need to rerun the r3live node.
Can you share your data with me, including both your rosbag files and your configurations?
I cannot open the folder; is there any problem?
@ziv-lin I am able to open it in incognito mode as well. Should I upload it somewhere other than Google Drive?
I can download the config files but can't get the rosbag file. Can you put them together in the same directory?
Done. Please check and let me know.
All right, I will try your bag tonight if possible.
At a glance, it seems that your result is correct?
Sorry, I uploaded the rosbag and the camera parameters (intrinsic and extrinsic) again in one link. I noticed that the resolution in your work is 1280 x 1024; however, my camera in this test is 2048 x 3072. Which config should I modify in R3LIVE? Change all of 1280 x 1024 to 2048 x 3072?
Yes, you should change all 1280 -> 2048 and 1024 -> 3072 (why is your height larger than your width? Is something wrong?)
Sorry, my bad, the resolution is actually 2048 x 3072 (height x width).
@ziv-lin I think the issue is that this: r3live/r3live/src/r3live_vio.cpp Lines 381 to 384 in 28f5365
Malforms the camera calibration matrix here when the camera width isn't 1280 pixels. r3live/r3live/src/r3live_vio.cpp Lines 195 to 196 in 28f5365
I had the same issue undistorting my input with an 848 x 800 camera.
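For illustration, a hedged sketch of a rescaling that avoids the problem described above: scale the intrinsic matrix by the ratio of the actual input resolution to the resolution used during calibration, rather than assuming a fixed 1280 x 1024. This is not the repo's code; the function and variable names are hypothetical.

```cpp
// Sketch: rescale a pinhole intrinsic matrix for a different image resolution.
#include <opencv2/core.hpp>

cv::Mat rescale_intrinsics(const cv::Mat &K_calib,   // 3x3 CV_64F intrinsic matrix
                           cv::Size calib_size,      // resolution used for calibration
                           cv::Size input_size)      // resolution of the incoming images
{
    const double sx = static_cast<double>(input_size.width)  / calib_size.width;
    const double sy = static_cast<double>(input_size.height) / calib_size.height;
    cv::Mat K = K_calib.clone();
    K.at<double>(0, 0) *= sx;   // fx
    K.at<double>(0, 2) *= sx;   // cx
    K.at<double>(1, 1) *= sy;   // fy
    K.at<double>(1, 2) *= sy;   // cy
    return K;                   // distortion coefficients do not change with resolution
}
```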
I am now fixing and testing this problem; please wait patiently.
@aditdoshi333 I ran your data carefully; it seems that the problem is that your calibration is not accurate enough, causing the color rendering to be inaccurate. In addition, the print of "****** Remove_outlier_using_ransac_pnp error*****" is caused by delayed incoming image messages (see the following figure). You can ignore this print since it has little effect after R3LIVE has received enough image frames. If you play the bag with '-s 5', this warning disappears:
Hi~ @aditdoshi333 @jxx315, I have just pushed a commit that fixes this bug; R3LIVE now allows you to set the image resolution correctly. Can you try this version? Please let me know if you find any bugs or problems.
You can now set your own image resolution by modifying these two configs: r3live/config/r3live_config.yaml Line 19 in 4d386cc
r3live/config/r3live_config.yaml Line 20 in 4d386cc
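For example, a config similar to the following would match a 1920 x 1080 camera. This is only a sketch; the exact key names and surrounding structure should be taken from the two lines of r3live_config.yaml referenced above.

```yaml
# Sketch only: set the VIO image size to the camera's actual resolution.
r3live_vio:
   image_width: 1920
   image_height: 1080
```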
@ziv-lin Thanks for the quick fix. I will run the new commit and update you in the next two days, and I will also check the calibration on my end. Thanks for all the effort. One more doubt: is the calibration between the camera and the IMU used only for texture mapping, or is it also used in mapping? I mean, is the preciseness of the calibration proportional to the accuracy of the mapping? Thank you
Hey @aditdoshi333. In my experience, a more precise calibration will give you a better mapping. See here for example (both pictures are a top view; one with a not-so-good calibration, one with a better one). You can clearly see that the better the calibration, the better the alignment of the incoming scans (white points). This is of course because the intrinsic and extrinsic parameters play an important role in determining where the points come from. I still have to do a better calibration in any case; you can see that even my "better calibration" is not perfect.
Great!
Hi! @aditdoshi333 May I ask how you add the camera RGB information to the LiDAR points in fast_lio? I am running into some problems.
Hello @ziv-lin,
Hello, you can change the point cloud struct to add RGB fields. I color the incoming point cloud before mapping; I use the Livox calibration code for coloring every frame.
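As a rough illustration of that approach (not the FAST-LIO or R3LIVE source; the types and names are placeholders, and the image is assumed to already be undistorted), each LiDAR point can be transformed into the camera frame with the camera-LiDAR extrinsic, projected with the pinhole intrinsics, and assigned the pixel color at that location.

```cpp
// Sketch: color a LiDAR scan with a synchronized, undistorted camera image.
#include <opencv2/opencv.hpp>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <Eigen/Dense>

pcl::PointCloud<pcl::PointXYZRGB> colorize(
    const pcl::PointCloud<pcl::PointXYZI> &scan, const cv::Mat &image,
    const Eigen::Matrix3d &R_cam_lidar, const Eigen::Vector3d &t_cam_lidar,
    double fx, double fy, double cx, double cy)
{
    pcl::PointCloud<pcl::PointXYZRGB> colored;
    for (const auto &p : scan.points)
    {
        // Transform the point into the camera frame.
        Eigen::Vector3d pc = R_cam_lidar * Eigen::Vector3d(p.x, p.y, p.z) + t_cam_lidar;
        if (pc.z() <= 0.1) continue;                 // behind or too close to the camera

        // Pinhole projection into pixel coordinates.
        int u = static_cast<int>(fx * pc.x() / pc.z() + cx);
        int v = static_cast<int>(fy * pc.y() / pc.z() + cy);
        if (u < 0 || u >= image.cols || v < 0 || v >= image.rows) continue;

        cv::Vec3b bgr = image.at<cv::Vec3b>(v, u);   // OpenCV stores pixels as BGR
        pcl::PointXYZRGB q;
        q.x = p.x; q.y = p.y; q.z = p.z;
        q.r = bgr[2]; q.g = bgr[1]; q.b = bgr[0];
        colored.points.push_back(q);
    }
    colored.width = static_cast<uint32_t>(colored.points.size());
    colored.height = 1;
    colored.is_dense = false;
    return colored;
}
```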
Thanks very much! I will try! |
@aditdoshi333 I am so sorry to hear that. Please take good care of yourself. Hoping everything goes well.
Hello @ziv-lin, I can confirm that the color offset was due to poor calibration. Sorry for the trouble.
Hi, I did the same work as you. Did you do time synchronization between the LiDAR and the camera? I mean hardware sync, because my result is not good: I don't get accurate colors on the point cloud.
Hello,
Thanks for sharing the code. In the config YAML file, there is an extrinsic between the camera and the LiDAR; I want to know whether the matrix is camera-to-LiDAR or LiDAR-to-camera.
And is there any reference for fine calibration between the camera and the LiDAR?
Thank you