
Support for external IMU and Fisheye cameras #14

Closed
tlin442 opened this issue Jan 3, 2022 · 20 comments


tlin442 commented Jan 3, 2022

Hi

First, I'd like to thank you for the source code release; it appears to run well with the provided datasets.
I'm now trying to run R3LIVE with a Livox MID-70 and a RealSense T265 (using its internal IMU).

Could you please let me know if the following is possible:

  • Custom IMU->LiDAR extrinsics. Due to packaging constraints, my LiDAR is not mounted aligned with the IMU.
  • Fisheye correction on input images from the T265.

Also, is it possible to use this in pure localization mode, i.e. to disable RGB map generation for real-time operation?

Many thanks!


ziv-lin commented Jan 3, 2022

I think the key point here is to calibrate both the temporal and spatial extrinsics among the IMU, LiDAR, and camera sensors, which will ultimately determine the overall performance of R3LIVE with your sensor setup.


Camilochiang commented Jan 3, 2022

I have a similar setup, and as ziv-lin commented, it is much easier if you get an I2C IMU. You can even get the same one that is in the Livox AVIA (a BMI088), for example here: https://wiki.seeedstudio.com/Grove-6-Axis_Accelerometer%26Gyroscope%28BMI088%29/


tlin442 commented Jan 3, 2022

I think the temporal and spatial extrinsics shouldn't be too difficult. The camera/IMU input is timestamped by the host driver, and the Livox is synced to system time via PTP. I also have camera->IMU extrinsics from the factory and can perform a camera->LiDAR calibration at a slightly later date. FAST-LIO2 works with the current setup and approximate IMU->LiDAR extrinsics.

My main issue is that R3LIVE immediately diverges on my setup. I'm not sure why, but I think it could be one of the following:

  • The IMU->LiDAR extrinsics being wrong. I am using imu_transformer to get the IMU samples into the LiDAR frame (see the sketch after this list).
  • The cameras on the T265 being mono fisheye.
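
For reference, a minimal sketch of the kind of frame change involved, with hypothetical names (this is not the actual imu_transformer code, which also handles orientation and covariances):

#include <Eigen/Dense>

// Rotate raw IMU readings from the IMU frame into the LiDAR frame using
// the extrinsic rotation R_lidar_imu. Gyro and accel are free vectors, so
// only the rotation applies; the lever-arm effect on accel is ignored
// here, which is only reasonable for small offsets.
void imu_to_lidar_frame(const Eigen::Matrix3d &R_lidar_imu,
                        Eigen::Vector3d &gyro, Eigen::Vector3d &accel)
{
    gyro  = R_lidar_imu * gyro;
    accel = R_lidar_imu * accel;
}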


tlin442 commented Jan 5, 2022

I've fixed my problem by calibrating system extrinsics and transforming the incoming point cloud by the LiDAR->IMU extrinsics in the LiDAR front end. Thanks for the help!
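
For anyone hitting the same issue, this is roughly the transform applied in the LiDAR front end; a minimal sketch with hypothetical names, assuming PCL and Eigen (not the exact code from my fork):

#include <Eigen/Dense>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

// Map each incoming LiDAR point into the IMU frame using the calibrated
// LiDAR->IMU extrinsics (R, t), so the downstream filter can treat the
// cloud as if it were measured in the IMU frame: p_imu = R * p_lidar + t.
void cloud_to_imu_frame(pcl::PointCloud<pcl::PointXYZI> &cloud,
                        const Eigen::Matrix3d &R_imu_lidar,
                        const Eigen::Vector3d &t_imu_lidar)
{
    for (auto &pt : cloud.points)
    {
        Eigen::Vector3d p = R_imu_lidar * Eigen::Vector3d(pt.x, pt.y, pt.z)
                            + t_imu_lidar;
        pt.x = p.x(); pt.y = p.y(); pt.z = p.z();
    }
}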

tlin442 closed this as completed Jan 5, 2022
seajayshore commented

Hi, can I ask how you calibrated the LiDAR-IMU extrinsics? Are you using a particular tool for this?


tlin442 commented Feb 9, 2022

@seajayshore I did a LiDAR-to-camera calibration and then used the RealSense's factory extrinsics for the rest. You can use Kalibr to get camera-to-IMU extrinsics if you don't have a factory IMU-to-camera calibration.
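
In other words, the LiDAR->IMU extrinsic falls out of chaining the two calibrations. A minimal sketch, with identity matrices as placeholders for the actual calibration results (assuming Eigen):

#include <Eigen/Dense>
#include <iostream>

int main()
{
    // Placeholders: substitute your own calibration results.
    Eigen::Isometry3d T_cam_lidar = Eigen::Isometry3d::Identity(); // LiDAR -> camera (e.g. livox_camera_calib)
    Eigen::Isometry3d T_imu_cam   = Eigen::Isometry3d::Identity(); // camera -> IMU (factory or Kalibr)

    // Chain the transforms: LiDAR -> IMU = (camera -> IMU) o (LiDAR -> camera).
    Eigen::Isometry3d T_imu_lidar = T_imu_cam * T_cam_lidar;
    std::cout << T_imu_lidar.matrix() << std::endl;
    return 0;
}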

jorgef1299 commented

Hi @tlin442! I have the same configuration as you (Livox Mid-70 + Intel T265). I would like to know whether you were able to get good results with this algorithm, and how you performed the LiDAR-camera calibration (since the T265 has a fisheye lens).
Thanks


tlin442 commented Apr 11, 2022

@jorgef1299, I am getting extremely good results (position drift of <1 m over ~200-300 m of translation and rotation).

You need to patch the VIO rectification to use the fisheye model; then you can use the Kannala-Brandt (KB4) parameters from the RealSense factory calibration to undistort and crop. My patches to get it working on my setup are in my fork of R3LIVE.

Camera->LiDAR calibration is done via https://github.com/hku-mars/livox_camera_calib, but I directly feed in the undistorted image and use no distortion model during the calibration.
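
As a rough illustration of the undistort-and-crop step (placeholder intrinsics and KB4 coefficients, not the actual T265 values or the exact patch in my fork), OpenCV's fisheye module can build the rectification maps once and then remap every frame:

#include <opencv2/opencv.hpp>

int main()
{
    // K and the four Kannala-Brandt coefficients come from the camera's
    // factory calibration; the numbers below are placeholders.
    cv::Matx33d K(285.0, 0.0, 424.0,
                  0.0, 285.0, 400.0,
                  0.0, 0.0, 1.0);
    cv::Vec4d D(-0.0045, 0.038, -0.036, 0.0061); // KB4: k1..k4
    cv::Size size(848, 800);

    // balance = 0 favors cropping away the invalid fisheye borders.
    cv::Mat newK, map1, map2;
    cv::fisheye::estimateNewCameraMatrixForUndistortRectify(
        K, D, size, cv::Matx33d::eye(), newK, 0.0);
    cv::fisheye::initUndistortRectifyMap(K, D, cv::Matx33d::eye(), newK,
                                         size, CV_16SC2, map1, map2);

    cv::Mat fisheye = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);
    cv::Mat rectified;
    cv::remap(fisheye, rectified, map1, map2, cv::INTER_LINEAR);
    cv::imwrite("rectified.png", rectified);
    return 0;
}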

jorgef1299 commented

Thank you @tlin442!

jorgef1299 commented

Hi @tlin442! I was trying your fork of R3LIVE and I am facing some issues with the fisheye image of the RealSense T265. The camera frame count is always 2, and that only happens with this camera. I think it may be related to the image encoding being "mono8". Have you ever faced this issue? Do you do any preprocessing on the images coming from the camera?

Also, I'm using the original image size (848x800 px) instead of 1280x1024; is that OK?

Thanks



tlin442 commented Apr 15, 2022

@jorgef1299 It sounds like your LiDAR isn't timestamping its messages properly. Are you using PPS or PTP with the Livox?

jorgef1299 commented

@tlin442 I'm using PTP with the Livox. That part is working, because I can see the LiDAR mapping on the right side of the R3LIVE window.
Can I use the mono8 fisheye image of the T265 camera directly, or does it require some processing?


tlin442 commented Apr 16, 2022

@jorgef1299 I directly use the mono8 image via /fisheye2/image_raw. Are you transforming the IMU frame correctly?


kakghiroshi commented Aug 25, 2022

> @jorgef1299, I am getting extremely good results (position drift of <1 m over ~200-300 m of translation and rotation).
>
> You need to patch the VIO rectification to use the fisheye model; then you can use the Kannala-Brandt (KB4) parameters from the RealSense factory calibration to undistort and crop. My patches to get it working on my setup are in my fork of R3LIVE.
>
> Camera->LiDAR calibration is done via https://github.com/hku-mars/livox_camera_calib, but I directly feed in the undistorted image and use no distortion model during the calibration.

Hi, I am planning to use https://github.com/hku-mars/livox_camera_calib to calibrate the extrinsics between a LiDAR and a fisheye camera with a 190° FOV whose distortion model is KB4. Could you please specify how you patched the VIO rectification and processed the images so that they can be fed into the algorithm with all distortion set to 0? If any of the code you used is open source, please post the link in a reply. :)


tlin442 commented Aug 25, 2022

@gara-9527

Please see my fork at https://github.com/tlin442/r3live

kakghiroshi commented

@tlin442 Thanks for your quick reply!
Actually, I'm confused about the LiDAR-camera calibration part. I want to know how you managed to use https://github.com/hku-mars/livox_camera_calib to calibrate the sensors. You mentioned that the image needs to be undistorted; which function do you use to achieve that? What I use is cv::fisheye::undistortImage. After undistorting and setting the distortion parameters to 0, can the algorithm work? Or is there anything else that needs to be changed? Please let me know. Many thanks again!


tlin442 commented Aug 31, 2022

> @tlin442 Thanks for your quick reply! Actually, I'm confused about the LiDAR-camera calibration part. I want to know how you managed to use https://github.com/hku-mars/livox_camera_calib to calibrate the sensors. You mentioned that the image needs to be undistorted; which function do you use to achieve that? What I use is cv::fisheye::undistortImage. After undistorting and setting the distortion parameters to 0, can the algorithm work? Or is there anything else that needs to be changed? Please let me know. Many thanks again!

Yes. I directly used the undistorted fisheye image with zero lens distortion as the input.
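
For completeness, a minimal sketch of that one-shot approach (placeholder K/D values, assuming OpenCV): undistort once with cv::fisheye::undistortImage, then feed the result to livox_camera_calib with all distortion coefficients set to zero.

#include <opencv2/opencv.hpp>

int main()
{
    cv::Matx33d K(285.0, 0.0, 424.0,
                  0.0, 285.0, 400.0,
                  0.0, 0.0, 1.0);                 // placeholder intrinsics
    cv::Vec4d D(-0.0045, 0.038, -0.036, 0.0061);  // placeholder KB4 coefficients

    cv::Mat fisheye = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);
    cv::Mat undistorted;
    // Knew = K keeps the original focal length and principal point.
    cv::fisheye::undistortImage(fisheye, undistorted, K, D, K);
    cv::imwrite("undistorted.png", undistorted);
    return 0;
}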


hr2894235132 commented Oct 27, 2022

> I've fixed my problem by calibrating system extrinsics and transforming the incoming point cloud by the LiDAR->IMU extrinsics in the LiDAR front end. Thanks for the help!


Hello, I would like to ask whether you have succeeded with the Mid-70?

farhad-dalirani commented

@tlin442 Hi,
I have a drifting problem; I explained my problem and setup in detail in this issue:
#157
It would be great if you could look at it. I found your answers related to my problem. 👍

fanshixiong commented

Hello, thank you very much for your work. When I use R3LIVE, there is a lot of drift, especially at corners; straight segments are not bad, but the same parameter configuration achieves good results in FAST-LIO2.
My R3LIVE version is the one with the external IMU rotation added, from this repository: @tlin442 https://github.com/tlin442/r3live
The LiDAR I use is a Livox Mid-70 with an external IMU, plus a MYNT EYE D1000 camera and its IMU, and the modified Livox driver.
My computer runs Ubuntu 20.04.
Here is my config file:

Lidar_front_end:
   lidar_type: 1   # 1 for Livox-avia, 3 for Ouster-OS1-64
   N_SCANS: 6
   using_raw_point: 1
   point_step: 1
   lidar_imu_rotm:
      # LiDAR is mounted rotated by 90 deg
      #[1, 0, 0,
      # 0, 0, 1,
      # 0, -1, 0]
      [ 0.016511, -0.999700,  0.018083,
        0.057071,  0.018999,  0.998189,
       -0.998234, -0.015449,  0.057368]
   lidar_imu_tranm: 
      [0.039342, 0.077608, 0.037443]

r3live_common:
   if_dump_log: 0                   # If recording ESIKF update log. [default = 0]
   record_offline_map: 1            # If recording offline map. [default = 1]
   pub_pt_minimum_views: 3          # Publish points which have been render up to "pub_pt_minimum_views" time. [default = 3]
   minimum_pts_size: 0.01           # The minimum distance for every two points in Global map (unit in meter). [default = 0.01] 
   image_downsample_ratio: 1        # The downsample ratio of the input image. [default = 1]
   estimate_i2c_extrinsic: 1        # If enable estimate the extrinsic between camera and IMU. [default = 1] 
   estimate_intrinsic: 1            # If enable estimate the online intrinsic calibration of the camera lens. [default = 1] 
   maximum_vio_tracked_pts: 600     # The maximum points for tracking. [default = 600]
   append_global_map_point_step: 4  # The point step of append point to global map. [default = 4]

   res_path: "/home/frans/code/r3live_proj/catkin_ws_r3live/src/r3live_/res"

r3live_vio:
   image_width: 1280
   image_height: 720
   camera_intrinsic:
      [655.005, 0, 679.029,
       0, 656.097, 358.596,
       0, 0, 1]
   camera_dist_coeffs: [-0.238605, 0.0435143, 0.000366211, -0.00272751, 0]  #k1, k2, p1, p2, k3
   
   # Fine extrinsic values, from IMU-to-camera calibration.
   camera_ext_R:
      [0.999998,    0.00183758,  0.000849753,
       0.00184018, -0.999994,   -0.00307635,
       0.000844095, 0.00307791, -0.999995]
   camera_ext_t: [0.0993128, 0.0117891, -0.176605] 

   
r3live_lio:        
   lio_update_point_step: 4   # Point step used for LIO update.  
   max_iteration: 2           # Maximum times of LIO esikf.
   lidar_time_delay: -0.092132       # The time-offset between LiDAR and IMU, provided by user. 
   filter_size_corner: 0.30   
   filter_size_surf: 0.30
   filter_size_surf_z: 0.30
   filter_size_map: 0.30

imu_tools was used for the IMU intrinsic calibration; the LiDAR-IMU calibration uses your work, hku-mars/LiDAR_IMU_Init; and the camera and IMU were calibrated with Kalibr, with a reprojection error of about 1.5 pixels.

The test results with FAST-LIO2 are better, and the results with VINS are average, but there is obvious drift in R3LIVE.
Is there any good solution to the drift? Thanks for the reply.
There is a large drift when facing a wall, but there is no drift in FAST-LIO2.
Thanks.
