Repository for the End-to-end Sketch-Guided Path Planning through Imitation Learning for Autonomous Mobile Robots paper.
It is recommended to work in the dockerized environment; however, we also provide steps to reproduce locally. The setup was last tested on:
- Ubuntu 22
- Graphics Card: NVIDIA Quadro RTX 8000
- GPU driver version: 545.23.08
Clone the repo and download the expert dataset:
git clone https://github.com/charbel-a-hC/SKIPP.git
cd SKIPP/
- Everything should be installed using the Makefile
make env
- You also need Tkinter installed for the corresponding python3 version:
sudo apt-get install python3.9-tk
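Before launching any of the GUI-based tooling, it can be useful to confirm that Tkinter is actually available for your interpreter. The following is a hypothetical helper (not part of the repo) that checks this without initializing any GUI resources:

```python
# Hypothetical check (not part of SKIPP): verify that Tkinter is installed
# for this interpreter before running GUI-based tooling.
import importlib.util


def has_tkinter() -> bool:
    # find_spec returns None when the python3-tk system package is missing,
    # without actually importing or initializing the GUI toolkit.
    return importlib.util.find_spec("tkinter") is not None


if __name__ == "__main__":
    print("tkinter available:", has_tkinter())
```

If this prints `False`, install the `python3.x-tk` package matching your Python version as shown above.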
For the dockerized setup you need:
- Docker
- NVIDIA Container Toolkit (for GPU support in Docker)
Then build the image:
docker build -t skipp .
The dataset can be found on Hugging Face. You can download the training or testing data by setting the `data_type` argument to `train` or `test`:
python3 utils/download_dataset.py --data_type <train/test>
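The script's command-line contract can be sketched as follows. This is a minimal illustration of the `--data_type` argument handling only; the actual implementation in `utils/download_dataset.py` additionally fetches the files from Hugging Face:

```python
# Minimal sketch of the --data_type argument handling; the real script in
# utils/download_dataset.py also downloads the selected split.
import argparse


def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Download the SKIPP expert dataset"
    )
    parser.add_argument(
        "--data_type",
        choices=["train", "test"],  # only these two splits exist
        required=True,
        help="which dataset split to download",
    )
    return parser.parse_args(argv)


if __name__ == "__main__":
    args = parse_args(["--data_type", "train"])
    print(args.data_type)
```

Passing any value other than `train` or `test` makes `argparse` exit with a usage error.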
The `train` and `test` datasets each contain two main directories corresponding to the tasks `L_Shape` and `U_Shape`. Each task contains multiple sequences in different environments, where each full sequence (0, 1, 2, ...) represents the full path of the AMR and the full traversal of that path.
- `egm`: Image (256x256x1) of the fused occupancy grid showing the current state of the robot, with obstacles in white, navigable (free-space) area in grey, and unseen areas in black.
- `path`: Binary image (256x256x1) of the path taken by the AMR to reach the target.
- `egm_goal_poses.npy`: Numpy array (M, 2), with `M` being the number of states in the sequence and the two values per state being the `x, y` pixel coordinates of the target position the AMR should reach.
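To make the format above concrete, here is an illustrative validation of one sequence's arrays. The helper and the synthetic stand-in data are assumptions for demonstration, not part of the repo's API; only the shapes follow the description above:

```python
# Illustrative check of one sequence's arrays against the documented format.
# The function and sample data are hypothetical; only the shapes are from
# the dataset description.
import numpy as np


def check_sequence(egm: np.ndarray, path: np.ndarray, goals: np.ndarray) -> None:
    # egm and path are single-channel 256x256 images.
    assert egm.shape == (256, 256, 1)
    assert path.shape == (256, 256, 1)
    # goals is (M, 2): one (x, y) pixel-coordinate target per state.
    assert goals.ndim == 2 and goals.shape[1] == 2
    assert len(goals) >= 1


# Synthetic stand-in data matching the documented shapes:
egm = np.zeros((256, 256, 1), dtype=np.uint8)
path = np.zeros((256, 256, 1), dtype=np.uint8)
goals = np.array([[10, 20], [30, 40]])  # two states, (x, y) each
check_sequence(egm, path, goals)
```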
We provide pretrained weights for a model for each shape (U-shape and L-shape), also available on Hugging Face:
Running `demo.py` will automatically download the weights into the specified `cache_dir` folder and run a sample evaluation on the pretrained model:
poetry run python3 demo.py --model <skipp-u-shape/skipp-l-shape> --device <cuda,cpu>
docker run --rm --gpus all -it -v $(pwd):/SKIPP skipp "poetry run python3 demo.py --model <skipp-u-shape/skipp-l-shape> --device <cuda,cpu>"
You should get a sample output:
{'rmse': 0.004815616734656429, 'mean': 0.003488807276019792, 'median': 0.003108456509546494, 'std': 0.003319395777230308, 'min': 1.7932809118558974e-10, 'max': 0.011985940516812088, 'sse': 0.02316697437056794, 'average_inference': 1114.814043045044, 'batch_fid': 26.23838233947754}
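The error statistics in the output above can be understood as standard summary statistics over the per-point path error. The sketch below shows one plausible way to derive them; the exact metric definitions used by `demo.py` (and the `average_inference` and `batch_fid` entries) are not reproduced here:

```python
# Sketch of how the reported error statistics could be derived from the
# per-point error between a generated path and the expert path. The exact
# definitions used by demo.py are assumptions here.
import numpy as np


def path_error_stats(errors: np.ndarray) -> dict:
    sq = errors ** 2
    return {
        "rmse": float(np.sqrt(sq.mean())),  # root mean squared error
        "mean": float(errors.mean()),
        "median": float(np.median(errors)),
        "std": float(errors.std()),
        "min": float(errors.min()),
        "max": float(errors.max()),
        "sse": float(sq.sum()),  # sum of squared errors
    }


# Toy per-point errors for illustration:
stats = path_error_stats(np.array([0.0, 0.003, 0.012]))
```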
along with a visual showing the APE (absolute pose error) of the generated path.
Note: Before mounting the source code to the Docker container (as shown in the command below), make sure to update the `*.config.yaml` file located at `SKIPP/configs` with the following required fields:
name: [your run name]
project: [your project name]
entity: [your team or user entity]
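These fields look like experiment-tracking run metadata (e.g. for Weights & Biases). A simple pre-flight check such as the hypothetical one below (not part of the repo) can catch a missing field before a training run is launched:

```python
# Hypothetical pre-flight check (not part of SKIPP): verify that a loaded
# config dict carries the required run-metadata fields before training.
REQUIRED_FIELDS = ("name", "project", "entity")


def missing_fields(config: dict) -> list:
    # A field counts as missing if absent or left empty.
    return [f for f in REQUIRED_FIELDS if not config.get(f)]


cfg = {"name": "my-run", "project": "skipp", "entity": "my-team"}
assert missing_fields(cfg) == []
```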
poetry run python3 train_bc.py --config configs/train_bc_sweep.config.yaml
docker run --rm --gpus all -it -v $(pwd):/SKIPP skipp "poetry run python3 train_bc.py --config configs/train_bc_sweep.config.yaml"