A robot motion planning simulator that can efficiently navigate partially observable environments using deep learning

mitchmcdee/xirtam


Xirtam

Xirtam is a 2.5D robot motion planning simulator. It supports both simulating motion planning and generating training data: during the training phase, foot placement and world region images are exported for later use in training TimTamNet.

TimTamNet

TimTamNet is a fully convolutional neural network designed to learn the mapping between foot placements and the hidden environment map. Significant effort was put into designing this network to be extremely small (933 learnable parameters) for efficient storage (130kB uncompressed) and inference (<1ms on laptop hardware).
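To see why a fully convolutional network can stay this small, note that a conv layer's learnable parameter count is just its kernel weights plus one bias per filter. The layer shapes below are purely illustrative (the actual TimTamNet architecture is defined in the repository, and these layers do not reproduce its exact 933-parameter count), but they show how a few narrow conv layers keep the total in the hundreds:

```python
def conv2d_params(kernel_h, kernel_w, in_channels, out_channels):
    """Learnable parameters of one conv layer: weights + one bias per filter."""
    return (kernel_h * kernel_w * in_channels + 1) * out_channels

# Hypothetical narrow stack (NOT the actual TimTamNet layout):
# a 1-channel placement image in, a 1-channel map estimate out.
layers = [
    (3, 3, 1, 4),
    (3, 3, 4, 8),
    (3, 3, 8, 4),
    (3, 3, 4, 1),
]
total = sum(conv2d_params(*layer) for layer in layers)
print(total)  # 665 -- same order of magnitude as TimTamNet's 933
```

Because the network is fully convolutional, this parameter count is independent of the input image size, which is what makes sub-millisecond inference on laptop hardware plausible.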

Requirements

  • Python 3.6
  • pip3 install -r requirements.txt

Running

Note: Additional settings can be viewed and adjusted in xirtam/core/settings.py!

For usage help:

python3 -m xirtam --help

To run a visual simulation:

python3 -m xirtam -s

To run a training simulation:

python3 -m xirtam -t

To generate a random map:

python3 -m xirtam -s -g

To split generated data into training and test sets (necessary for training):

python3 xirtam/neural/split_util.py -r path/to/robot_dir/

To utilise a trained TimTamNet model:

python3 -m xirtam -s -n path/to/model.hdf5

To train a TimTamNet model:

python3 xirtam/neural/train.py -r path/to/robot_dir/

To debug and view the results of a TimTamNet model:

jupyter notebook xirtam/neural/debug.ipynb

Thesis

Efficient Mapping in Partially Observable Environments using Deep Learning
