Leyuan Sun, Guanqun Ding, Yue Qiu, Yusuke Yoshiyasu and Fumio Kanehiro, 2023. TransFusionOdom: Interpretable Transformer-based LiDAR-Inertial Fusion Odometry Estimation.
@misc{sun2023transfusionodom,
title={TransFusionOdom: Interpretable Transformer-based LiDAR-Inertial Fusion Odometry Estimation},
author={Leyuan Sun and Guanqun Ding and Yue Qiu and Yusuke Yoshiyasu and Fumio Kanehiro},
year={2023},
eprint={2304.07728},
archivePrefix={arXiv},
primaryClass={cs.RO}
}
Test platform: Ubuntu 20.04 + ros-noetic-desktop-full
leyuansun@outlook.com 2023.2
Realsense D435: RGB + Depth
Velodyne VLP-16: 3D LiDAR point cloud
Inertial Measurement Unit (IMU): Linear acceleration + angular velocity.
Ground truth trajectory: timestamp + tx + ty + tz + qx + qy + qz + qw
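The ground-truth file uses the TUM trajectory layout shown above (timestamp, translation, then quaternion, space-separated). As a minimal sketch of that layout (the function names are illustrative, not from the repo's scripts):

```python
# Sketch: write and read one TUM-format ground-truth line
# (timestamp tx ty tz qx qy qz qw). Helper names are hypothetical.

def format_tum_line(stamp, translation, quaternion):
    """Format one pose as a TUM trajectory line."""
    tx, ty, tz = translation
    qx, qy, qz, qw = quaternion
    return "%.6f %f %f %f %f %f %f %f" % (stamp, tx, ty, tz, qx, qy, qz, qw)

def parse_tum_line(line):
    """Parse a TUM trajectory line back into (stamp, translation, quaternion)."""
    vals = [float(v) for v in line.split()]
    return vals[0], tuple(vals[1:4]), tuple(vals[4:8])
```

Tools such as evo (used below) accept files of such lines directly with `evo_traj tum`.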
Simulation in Gazebo with ROS. Follow the steps below:
Step 1:
git clone this repository
cd neor_mini/mini_sim18_ws
rosdep install --from-paths src --ignore-src -r -y
catkin_make
You can see 2 ROS packages in the mini_sim18_ws/src folder:
neor_mini # Stores the description of the neor mini's appearance as a URDF file
steer_mini_gazebo # Stores the launch files for visualizing the neor mini model in Gazebo
Step 2: Launch the UGV's launch file and visualize the URDF in RViz.
# show the URDF of the UGV in RViz
# source devel/setup.bash first
cd ~/neor_mini/mini_sim18_ws/src/neor_mini/launch
roslaunch display_gazebo_sensors_VLP16.launch
Step 3: Visualize the URDF in Gazebo and control the UGV.
# show the URDF of the UGV in Gazebo and control it using the GUI
# source devel/setup.bash first
cd ~/neor_mini/mini_sim18_ws/src/steer_mini_gazebo/mini_gazebo/launch
roslaunch steer_mini_gazebo steer_mini_sim.launch
You can visualize the different modalities and the ground truth in RViz by adding their topics.
Step 4: Record all modalities and ground truth in the rosbag.
cd ~/neor_mini/mini_sim18_ws/src/steer_mini_gazebo/mini_gazebo/launch
roslaunch steer_mini_gazebo steer_mini_sim.launch
rosbag record -O bag_file_name /d435/color/image_raw /d435/depth/image_raw /imu/data /velodyne_points /ackermann_steering_controller/odom
The recorded bag file includes the following topics.
topics: /ackermann_steering_controller/odom : nav_msgs/Odometry # no visible drift
/d435/color/image_raw : sensor_msgs/Image
/d435/depth/image_raw : sensor_msgs/Image
/imu/data : sensor_msgs/Imu
/velodyne_points : sensor_msgs/PointCloud2
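The depth topic above is stored as 16UC1 (one uint16 per pixel; RealSense-style depth is conventionally in millimeters, which is an assumption here). A minimal sketch of converting raw depth values to meters, treating 0 as invalid:

```python
# Sketch: convert a row of 16UC1 depth values (uint16, assumed millimeters)
# to meters; 0 is the "no measurement" sentinel. Helper name is hypothetical.

def depth_mm_to_m(raw_row):
    """Map raw uint16 depth values to meters, keeping invalid pixels as None."""
    return [None if d == 0 else d / 1000.0 for d in raw_row]
```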
Step 5: Extract modalities from rosbag.
Each extracted file is named by its timestamp: timestr = "%.6f" % msg.header.stamp.to_sec()
python3 extractgt.py # extract ground truth trajectory; gt extraction details from the simulation state will be updated later
python3 extractimg.py #extract RGB images
python3 extractdepth.py #extract 16UC1 depth images
python3 extractimu.py #extract angular velocity and linear acceleration in xyz
rosrun pcl_ros bag_to_pcd <*.bag> /velodyne_points <output_directory> #extract LiDAR point cloud to pcd files
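Because every modality is named by the same `"%.6f"` timestamp convention, frames from different sensors can be associated by nearest timestamp. A sketch of that pairing (helper names are illustrative, not from the provided scripts):

```python
# Sketch: reproduce the "%.6f" filename convention and associate frames
# from two modalities by nearest timestamp. Names are hypothetical.

def stamp_to_name(stamp_sec, ext):
    """Build a filename like 1676246400.123456.png from a stamp in seconds."""
    return "%.6f%s" % (stamp_sec, ext)

def nearest_stamp(target, stamps):
    """Return the timestamp in `stamps` closest to `target`."""
    return min(stamps, key=lambda s: abs(s - target))
```

For example, an RGB frame at t=1.23 s would be paired with the depth frame whose stamp minimizes the absolute time difference.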
You can try adding other sensors and world maps (even dynamic environments) to extend this repo.
Gazebo world maps:
- https://github.com/sychaichangkun/ROS-Academy-for-Beginners
- https://github.com/mlherd/Dataset-of-Gazebo-Worlds-Models-and-Maps
Steer_drive_ros to control the motion of the UGV
Velodyne VLP-16 3D LiDAR
Realsense D435 for RGB and depth
Framework and UGV model were modified from
Trajectory plot tool evo
You can download the rosbag here.
Use the provided scripts to extract the modalities you need, and use evo
to visualize the ground truth trajectory.
evo_traj tum gt.txt -p -v --plot_mode xy #visualize the extracted ground truth trajectory
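Besides plotting, a quick sanity check on the extracted ground truth is its total path length. A sketch under the TUM layout above (the helper is hypothetical, not part of evo or the repo's scripts):

```python
# Sketch: total travelled distance of a trajectory given as a list of
# (tx, ty, tz) translations, e.g. columns 2-4 of a TUM gt file.
import math

def path_length(positions):
    """Sum the Euclidean distances between consecutive positions."""
    total = 0.0
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        total += math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)
    return total
```

A trajectory whose reported length is wildly off from the Gazebo run's expected distance usually indicates a timestamp or frame-extraction problem.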
Depth + RGB + IMU + LiDAR point cloud + ground truth trajectory are all included in the rosbag.
The trajectory in the Gazebo world map.
The details of this dataset (including all recorded rosbags and the ROS/Gazebo-related code) will be made public after our submission is accepted.