Simplified Docker version of the implementation of our 6-DOF egomotion estimation method using a single-chip mmWave radar (TI AWR1843) and a commercial-grade IMU. Our method is a first-of-its-kind DNN-based odometry approach that estimates egomotion from the sparse and noisy data returned by a single-chip mmWave radar.
milliEgo: Single-chip mmWave Aided Egomotion Estimation with Deep Sensor Fusion
Chris Xiaoxuan Lu, Muhamad Risqi U. Saputra, Peijun Zhao, Yasin Almalioglu, Pedro P. B. de Gusmao, Changhao Chen, Ke Sun, Niki Trigoni, Andrew Markham
In SenSys 2020.
- TensorFlow 2.X (tested with '2.9.0-dev20220322')
- PyYAML
- h5py
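A quick sanity check for the environment, written as a minimal sketch (not part of the original repo); it only verifies that the packages above import and prints their versions. The 2.x version assertion is our assumption based on the requirement list.

```python
# check_env.py -- minimal dependency sanity check (illustrative, not part of the repo)
import tensorflow as tf
import yaml
import h5py

print("TensorFlow:", tf.__version__)  # tested with a 2.9 nightly build
print("PyYAML:", yaml.__version__)
print("h5py:", h5py.__version__)

assert tf.__version__.startswith("2."), "TensorFlow 2.x is required"
```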
- After cloning this repository, enter the project directory and create the model folder:
mkdir -p models/cross-mio
- Download the trained milliEgo model '18' from here.
- Unzip and put it in ./models/cross-mio/.
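A small sketch to confirm the pretrained model was extracted to the expected place. The directory name comes from the steps above; the check only lists whatever files the archive contains, since their exact names are not documented here.

```python
# check_model.py -- confirm the pretrained checkpoint was extracted (illustrative only)
import os

model_dir = "./models/cross-mio"  # directory created in the step above
assert os.path.isdir(model_dir), f"{model_dir} does not exist; run mkdir -p models/cross-mio"

files = os.listdir(model_dir)
assert files, f"{model_dir} is empty; unzip the downloaded milliEgo model '18' into it"
print("Found model files:", files)
```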
- To train and test a model, please download our dataset from here (dropbox link).
- In config.yaml, set "multimodal_data_dir" to the dataset path and "tf_data_dir" to a target path where the converted tf.data.Dataset will be stored (see the sketch after this list).
- Run dataset_convert.py to convert the training set to a tf.data.Dataset; it will be written to the path specified by "tf_data_dir" in config.yaml.
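A minimal sketch of how those two config.yaml entries might be consumed, assuming the keys sit at the top level of config.yaml. The tf.data.Dataset.load call is an assumption on our part (it requires TF 2.9+; older 2.x releases use tf.data.experimental.load) and the element structure depends entirely on what dataset_convert.py produces.

```python
# load_config.py -- illustrate reading the config paths and reloading the converted dataset
# (assumed config structure; not taken from the repo's scripts)
import yaml
import tensorflow as tf

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

multimodal_data_dir = cfg["multimodal_data_dir"]  # raw mmWave/IMU dataset (downloaded above)
tf_data_dir = cfg["tf_data_dir"]                  # output location of dataset_convert.py

# After running dataset_convert.py, the converted training set can be reloaded
# with the standard tf.data saved-dataset API.
train_ds = tf.data.Dataset.load(tf_data_dir)
print(train_ds.element_spec)
```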
- To test: python test.py
- To train: python tf_train.py
If you find this work useful for your research, please cite the following.
@inproceedings{lu2020milliego,
title={milliEgo: single-chip mmWave radar aided egomotion estimation via deep sensor fusion},
author={Lu, Chris Xiaoxuan and Saputra, Muhamad Risqi U and Zhao, Peijun and Almalioglu, Yasin and de Gusmao, Pedro PB and Chen, Changhao and Sun, Ke and Trigoni, Niki and Markham, Andrew},
booktitle={Proceedings of the 18th Conference on Embedded Networked Sensor Systems (SenSys)},
year={2020}
}