RakugenSon/Multi-modal-dataset-for-odometry-estimation

Gazebo-based Synthetic Multi-modal Dataset

Citation (arXiv)

Leyuan Sun, Guanqun Ding, Yue Qiu, Yusuke Yoshiyasu and Fumio Kanehiro, 2023. TransFusionOdom: Interpretable Transformer-based LiDAR-Inertial Fusion Odometry Estimation. PDF

Bibtex

 @misc{sun2023transfusionodom,
      title={TransFusionOdom: Interpretable Transformer-based LiDAR-Inertial Fusion Odometry Estimation}, 
      author={Leyuan Sun and Guanqun Ding and Yue Qiu and Yusuke Yoshiyasu and Fumio Kanehiro},
      year={2023},
      eprint={2304.07728},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}

Test platform: Ubuntu 20.04 + ros-noetic-desktop-full

leyuansun@outlook.com 2023.2

Contents:

1. Multi-modal dataset

- Realsense D435: RGB + depth images
- Velodyne VLP-16: 3D LiDAR point cloud
- Inertial Measurement Unit (IMU): linear acceleration + angular velocity
- Ground truth trajectory (TUM format): timestamp tx ty tz qx qy qz qw
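The ground truth follows the TUM trajectory format shown above (one pose per line). As a minimal illustration in plain Python — a sketch, not one of this repo's scripts — a single line can be parsed like this:

```python
# Parse one line of a TUM-format trajectory file:
#   timestamp tx ty tz qx qy qz qw
# (illustrative sketch; the repo's own extraction scripts are separate)

def parse_tum_line(line):
    """Split a TUM trajectory line into timestamp, translation, quaternion."""
    vals = [float(v) for v in line.split()]
    t = vals[0]
    tx, ty, tz = vals[1:4]
    qx, qy, qz, qw = vals[4:8]
    return t, (tx, ty, tz), (qx, qy, qz, qw)

t, trans, quat = parse_tum_line("1600000000.123456 0.1 0.2 0.0 0.0 0.0 0.0 1.0")
print(t, trans, quat)
```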

Simulation runs in Gazebo with ROS. Follow the steps below:

Step 1:

git clone this repository
cd neor_mini/mini_sim18_ws
rosdep install --from-paths src --ignore-src -r -y  
catkin_make                            

After building, you can see two ROS packages in the mini_sim18_ws/src folder:

neor_mini                 # URDF description of the neor mini UGV's appearance
steer_mini_gazebo         # Launch files for visualizing the neor mini model in Gazebo

Step 2: Launch the UGV's launch file and visualize the URDF in Rviz.

# show the URDF of the UGV in Rviz
# source devel/setup.bash first
cd ~/neor_mini/mini_sim18_ws/src/neor_mini/launch
roslaunch display_gazebo_sensors_VLP16.launch

Step 3: Visualize the URDF in Gazebo and control the UGV.

# show the URDF of the UGV in Gazebo and control it using the GUI
# source devel/setup.bash first
cd ~/neor_mini/mini_sim18_ws/src/steer_mini_gazebo/mini_gazebo/launch
roslaunch steer_mini_gazebo steer_mini_sim.launch

You can visualize the different modalities and the ground truth in Rviz by adding their topics.

Step 4: Record all modalities and the ground truth into a rosbag.

cd ~/neor_mini/mini_sim18_ws/src/steer_mini_gazebo/mini_gazebo/launch
roslaunch steer_mini_gazebo steer_mini_sim.launch
rosbag record -O bag_file_name /d435/color/image_raw /d435/depth/image_raw /imu/data /velodyne_points /ackermann_steering_controller/odom

The recorded bag file includes the following topics:

topics:      /ackermann_steering_controller/odom      : nav_msgs/Odometry # no visible drift
             /d435/color/image_raw                    : sensor_msgs/Image      
             /d435/depth/image_raw                    : sensor_msgs/Image      
             /imu/data                                : sensor_msgs/Imu        
             /velodyne_points                         : sensor_msgs/PointCloud2
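The topics above arrive at different rates (the IMU is much faster than the camera), so downstream odometry work typically pairs messages across modalities by nearest timestamp. A minimal sketch of that association, with illustrative timestamps not taken from the released bag:

```python
# Pair each camera timestamp with the nearest IMU sample by timestamp,
# as one would when fusing modalities from this bag. The timestamps
# below are illustrative, not taken from the released rosbag.
import bisect

def nearest_match(query_stamps, ref_stamps):
    """For each query stamp, return the index of the closest stamp in
    ref_stamps (which must be sorted ascending)."""
    matches = []
    for q in query_stamps:
        i = bisect.bisect_left(ref_stamps, q)
        # consider the neighbours on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ref_stamps)]
        matches.append(min(candidates, key=lambda j: abs(ref_stamps[j] - q)))
    return matches

imu_stamps = [0.00, 0.01, 0.02, 0.03, 0.04]   # ~100 Hz IMU (seconds)
rgb_stamps = [0.004, 0.033]                   # ~30 Hz camera
print(nearest_match(rgb_stamps, imu_stamps))  # [0, 3]
```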

Step 5: Extract modalities from the rosbag.

Extracted files are named by their message timestamp: timestr = "%.6f" % msg.header.stamp.to_sec()

python3 extractgt.py                      # extract ground truth trajectory; gt extraction details from the simulation state will be updated later
python3 extractimg.py                     # extract RGB images
python3 extractdepth.py                   # extract 16UC1 depth images
python3 extractimu.py                     # extract angular velocity and linear acceleration in x, y, z
rosrun pcl_ros bag_to_pcd <*.bag> /velodyne_points <output_directory>  # extract the LiDAR point cloud to PCD files
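The naming rule above means that, within one recording, a plain lexicographic sort of file names also sorts them by time. A small sketch of that rule; stamp_to_name is a hypothetical helper, not one of the repo's scripts:

```python
# Illustration of the naming rule: each extracted message is saved under
# its header timestamp printed with six decimal places.
# stamp_to_name is a hypothetical helper, not part of this repo.
def stamp_to_name(secs, nsecs, ext):
    """Mimic "%.6f" % msg.header.stamp.to_sec() for a ROS time."""
    return "%.6f%s" % (secs + nsecs * 1e-9, ext)

print(stamp_to_name(100, 123456000, ".png"))  # 100.123456.png
```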

You can add other sensors and world maps (even dynamic environments) to extend this repo.

Acknowledgement

Gazebo world maps:

Steer_drive_ros to control the motion of UGV

Velodyne VLP-16 3D LiDAR

Realsense D435 for RGB and depth

Framework and UGV model were modified from

Trajectory plot tool evo

Dataset example

You can download the rosbag here.

Use the provided scripts to extract the modalities you need, then use evo to visualize the ground truth trajectory.

evo_traj tum gt.txt -p -v --plot_mode xy         #visualize the extracted ground truth trajectory
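Before plotting, a quick sanity check on the extracted gt.txt can be done in plain Python, e.g. the total path length summed over the tx ty tz columns. The sample poses below are illustrative, not from the released bag:

```python
# Quick sanity check on a TUM-format gt.txt before plotting: total path
# length summed over consecutive tx ty tz positions.
# The sample poses below are illustrative, not from the released bag.
import math

def path_length(tum_lines):
    """Sum Euclidean distances between consecutive TUM poses."""
    pts = [tuple(float(v) for v in ln.split()[1:4]) for ln in tum_lines if ln.strip()]
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

sample = [
    "0.000000 0 0 0 0 0 0 1",
    "0.100000 3 4 0 0 0 0 1",
    "0.200000 3 4 12 0 0 0 1",
]
print(path_length(sample))  # 17.0
```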

Depth + RGB + IMU + LiDAR point cloud + ground truth trajectory are all inside the rosbag.

The trajectory in the Gazebo world map.

The details of this dataset (including all recorded rosbags and the ROS/Gazebo-related code) will be made public after our submission is accepted.

About

Public dataset as supplementary material for IEEE Sensors Journal submission
