Support for external IMU and Fisheye cameras #14
I think the key point in your problem is to calibrate both the temporal and spatial extrinsics among the IMU-LiDAR-camera sensors, which will ultimately determine the overall performance of R3LIVE with your sensor setup.
I have a similar setup, and as ziv-lin commented, it is much easier if you get an I2C IMU. You can even get the same one that is in the Livox AVIA (BMI088), [for example here](https://wiki.seeedstudio.com/Grove-6-Axis_Accelerometer%26Gyroscope%28BMI088%29/).
I think the temporal and spatial extrinsics shouldn't be too difficult. Camera/IMU input is timestamped by the host driver, and the Livox is synced with system time via PTP. I also have camera->IMU extrinsics from the factory and can perform camera->LiDAR calibration at a slightly later date. FAST-LIO2 works with the current setup and approximate IMU->LiDAR extrinsics. My main issue is that R3LIVE immediately diverges on my setup. I'm not sure why, but I think it could be one of the following:
I've fixed my problem by calibrating the system extrinsics and transforming the incoming point cloud by the LiDAR->IMU extrinsics in the LiDAR front end. Thanks for the help!
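The fix described above amounts to applying a rigid-body transform to each incoming scan before it enters the front end. A minimal numpy sketch (function and frame names are illustrative, not taken from the R3LIVE code; the example extrinsics are made up):

```python
import numpy as np

def transform_cloud(points_lidar, R_il, t_il):
    """Transform an N x 3 point cloud from the LiDAR frame into the
    IMU frame using the extrinsic rotation R_il and translation t_il."""
    return points_lidar @ R_il.T + t_il

# Illustrative extrinsics: 90-degree yaw between LiDAR and IMU, 5 cm offset.
R_il = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
t_il = np.array([0.05, 0.0, 0.0])

pts = np.array([[1.0, 0.0, 0.0]])
print(transform_cloud(pts, R_il, t_il))  # → [[0.05 1.   0.  ]]
```

In a ROS node this would run once per scan callback, with `R_il` and `t_il` read from the calibration result rather than hard-coded.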
Hi, can I ask how you calibrated the LiDAR-IMU extrinsics? Are you using a particular tool for this?
@seajayshore I did a LiDAR-to-camera calibration and then used the RealSense's factory extrinsics for the rest. You can use kalibr to get camera-to-IMU extrinsics if you don't have a factory IMU-to-camera calibration.
Hi @tlin442! I have the same configuration as you (Livox Mid-70 + Intel T265). I would like to know if you were able to get good results with this algorithm, and how you performed the LiDAR-camera calibration (since the T265 has a fisheye lens).
@jorgef1299, I am getting extremely good results (position drift of <1 m over ~200-300 m of translation and rotation). You need to patch the VIO rectification to use the fisheye model; then you can use the Kannala-Brandt (KB4) parameters from the RealSense factory calibration to undistort and crop. My patches to get it working on my setup are on my fork of R3LIVE. Camera->LiDAR calibration is done via https://github.com/hku-mars/livox_camera_calib, but I directly feed in the undistorted image and use no distortion model during the calibration.
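For reference, the KB4 model mentioned above maps the angle θ between a ray and the optical axis to a distorted angle θ_d = θ(1 + k1·θ² + k2·θ⁴ + k3·θ⁶ + k4·θ⁸). A minimal numpy sketch of the forward projection (the intrinsics and coefficients below are illustrative placeholders, not the T265 factory values, and this is not R3LIVE's actual code path):

```python
import numpy as np

def kb4_distort(theta, k):
    """Kannala-Brandt distorted angle:
    theta_d = theta * (1 + k1*t^2 + k2*t^4 + k3*t^6 + k4*t^8)."""
    t2 = theta * theta
    return theta * (1.0 + k[0] * t2 + k[1] * t2**2 + k[2] * t2**3 + k[3] * t2**4)

def project_kb4(p_cam, K, k):
    """Project a 3D point in the camera frame to pixel coordinates
    with the KB4 fisheye model (intrinsic matrix K, coefficients k)."""
    x, y, z = p_cam
    r = np.hypot(x, y)                  # radial distance in the image plane
    theta = np.arctan2(r, z)            # angle from the optical axis
    scale = kb4_distort(theta, k) / r if r > 1e-9 else 1.0
    u = K[0, 0] * x * scale + K[0, 2]
    v = K[1, 1] * y * scale + K[1, 2]
    return u, v

# Illustrative intrinsics (NOT the T265 factory calibration).
K = np.array([[285.0,   0.0, 424.0],
              [  0.0, 285.0, 400.0],
              [  0.0,   0.0,   1.0]])
k = np.array([-0.008, 0.047, -0.044, 0.008])  # sample KB4 coefficients
print(project_kb4((0.2, -0.1, 1.0), K, k))
```

Undistorting an image is the inverse operation: for each target pixel, invert `kb4_distort` (e.g. by a few Newton iterations) to find the source pixel and resample.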
Thank you @tlin442! |
Hi @tlin442! I was trying your fork of R3LIVE and I am facing some issues with the fisheye image of the RealSense T265. The camera frame is always 2, but that only happens with this camera. I think it may be related to the image encoding being "mono8". Have you ever faced this issue? Do you do any preprocessing on the images coming from the camera? Also, I'm using the original image size (848x800 px) instead of 1280x1024; is that okay? Thanks
@jorgef1299 sounds like your LiDAR isn't timestamping its messages properly. Are you using PPS or PTP with the Livox?
@tlin442 I'm using PTP with the Livox. That part is working, because I can see the LiDAR mapping on the right of the R3LIVE window.
@jorgef1299 I directly use the mono8 image via /fisheye2/image_raw. Are you transforming the IMU frame correctly? |
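Getting the IMU frame right means rotating both the gyroscope and accelerometer vectors into the frame the estimator expects. A minimal numpy sketch (the function name and the example rotation are illustrative, not from R3LIVE):

```python
import numpy as np

def imu_to_camera(gyro_b, acc_b, R_cb):
    """Rotate gyro and accelerometer vectors from the IMU body frame into
    the camera frame via the extrinsic rotation R_cb. The accelerometer
    lever-arm term is ignored, which is reasonable when the IMU and camera
    are rigidly mounted millimetres apart, as in the T265."""
    return R_cb @ gyro_b, R_cb @ acc_b

# Illustrative: 90-degree rotation about x between the two frames.
R_cb = np.array([[1.0, 0.0,  0.0],
                 [0.0, 0.0, -1.0],
                 [0.0, 1.0,  0.0]])
gyro_c, acc_c = imu_to_camera(np.array([0.0, 0.0, 0.1]),
                              np.array([0.0, 0.0, 9.81]), R_cb)
print(acc_c)  # body z-axis gravity reading appears on the camera -y axis
```

A quick sanity check on a static sensor: after the transform, the accelerometer should read approximately +g along whichever axis the target frame defines as "up"; if it doesn't, the extrinsic rotation is wrong.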
Hi, I am planning to use https://github.com/hku-mars/livox_camera_calib to calibrate the extrinsics between a LiDAR and a fisheye camera with a 190° FOV whose distortion model is KB4. Could you please specify how you patch the VIO rectification and process the images so that they can be fed into the algorithm with all distortion set to 0? If any of the code you used is open source, please post the link in your reply. : )
@gara-9527 Please see my fork at https://github.com/tlin442/r3live |
@tlin442 Thanks for your quick reply! |
Yes. I directly used the undistorted fisheye image with zero lens distortion as an input. |
Hello, I would like to ask if you have succeeded with the Mid-70?
Hello, thank you very much for your work. When I use R3LIVE, there is a lot of drift, especially at corners; straight segments are not bad, but the same parameter configuration achieves good results in FAST-LIO2.
I used imu_tools for the IMU intrinsic calibration, your hku-mars/LiDAR_IMU_Init for the LiDAR-IMU calibration, and kalibr for the camera-IMU calibration, with a reprojection error of about 1.5 pixels. The results on FAST-LIO2 are good and the results on VINS are average, but there is obvious drift in R3LIVE.
Hi
First I'd like to thank you for the source code release, it appears to be running well with the provided datasets.
I'm now trying to run R3LIVE using a Livox Mid-70 and a RealSense T265 (and its internal IMU).
Could you please let me know if the following is possible:
Also, is it possible to use this in pure localization mode, i.e. disable RGB map generation for real-time operation?
Many thanks!