
Universal Manipulation Interface

In-The-Wild Robot Teaching Without In-The-Wild Robots

[Project page] [Paper] [Hardware Guide] [Data Collection Instruction] [SLAM repo] [SLAM docker]

Cheng Chi<sup>1,2</sup>, Zhenjia Xu<sup>1,2</sup>, Chuer Pan<sup>1</sup>, Eric Cousineau<sup>3</sup>, Benjamin Burchfiel<sup>3</sup>, Siyuan Feng<sup>3</sup>, Russ Tedrake<sup>3</sup>, Shuran Song<sup>1,2</sup>

<sup>1</sup>Stanford University, <sup>2</sup>Columbia University, <sup>3</sup>Toyota Research Institute

🛠️ Installation

Only tested on Ubuntu 22.04

Install Docker following the official documentation and complete the linux-postinstall steps so that docker can be run without sudo.
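To verify that Docker and the post-install steps work, you can run the standard check from Docker's own documentation:

$ docker run hello-world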

Install system-level dependencies:

$ sudo apt install -y libosmesa6-dev libgl1-mesa-glx libglfw3 patchelf

We recommend Miniforge (which ships with mamba) instead of the standard Anaconda distribution for faster environment solving:

$ mamba env create -f conda_environment.yaml
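If you prefer plain conda over mamba, the same environment file works (conda reads the identical YAML format, just with a slower solver):

$ conda env create -f conda_environment.yaml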

Activate the environment:

$ conda activate umi
(umi)$ 

Running the UMI SLAM pipeline

Download the example data. The wget flags below mirror the remote example_demo_session/ directory into a local folder of the same name:

(umi)$ wget --recursive --no-parent --no-host-directories --cut-dirs=2 --relative --reject="index.html*" https://real.stanford.edu/umi/data/example_demo_session/

Run the SLAM pipeline:

(umi)$ python run_slam_pipeline.py example_demo_session

...
Found following cameras:
camera_serial
C3441328164125    5
Name: count, dtype: int64
Assigned camera_idx: right=0; left=1; non_gripper=2,3...
             camera_serial  gripper_hw_idx                                     example_vid
camera_idx                                                                                
0           C3441328164125               0  demo_C3441328164125_2024.01.10_10.57.34.882133
99% of raw data are used.
defaultdict(<function main.<locals>.<lambda> at 0x7f471feb2310>, {})
n_dropped_demos 0

For this dataset, 99% of the raw data is usable (successful SLAM), with 0 demonstrations dropped. If your dataset has a low SLAM success rate, double-check that you carefully followed our data collection instructions.

Despite our significant effort to improve robustness, ORB_SLAM3 is still the most fragile part of the UMI pipeline. If you are an expert in SLAM, please consider contributing to our fork of ORB_SLAM3, which is specifically optimized for the UMI workflow.

Generate the dataset for training:

(umi)$ python scripts_slam_pipeline/07_generate_replay_buffer.py -o example_demo_session/dataset.zarr.zip example_demo_session
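To sanity-check the output, the zipped zarr store can be opened directly in Python. This is a minimal sketch, assuming zarr v2 and the diffusion_policy ReplayBuffer layout (a data group of arrays plus meta/episode_ends); verify the key names against your own file:

import zarr

# Open the zipped zarr store read-only.
store = zarr.ZipStore('example_demo_session/dataset.zarr.zip', mode='r')
root = zarr.open_group(store, mode='r')

print(root.tree())  # overview of groups and array shapes

# Assumption: meta/episode_ends holds the cumulative end index of each episode.
episode_ends = root['meta/episode_ends'][:]
print(f'{len(episode_ends)} episodes, {int(episode_ends[-1])} steps total')

store.close()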

Training Diffusion Policy

Single-GPU training, tested on an RTX 3090 (24 GB):

(umi)$ python train.py --config-name=train_diffusion_unet_timm_umi_workspace task.dataset_path=example_demo_session/dataset.zarr.zip
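Since train.py is a Hydra entry point (the --config-name flag above is Hydra syntax), other config fields can be overridden with the same key=value dotted syntax as task.dataset_path. For example, assuming the config exposes a training.seed field as in the standard diffusion_policy layout (check the config file for the exact keys):

(umi)$ python train.py --config-name=train_diffusion_unet_timm_umi_workspace task.dataset_path=example_demo_session/dataset.zarr.zip training.seed=43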

Multi-GPU training:

(umi)$ accelerate launch --num_processes <ngpus> train.py --config-name=train_diffusion_unet_timm_umi_workspace task.dataset_path=example_demo_session/dataset.zarr.zip
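If accelerate has not been configured on this machine yet, its interactive setup (a standard accelerate subcommand) lets you pick the GPU count and mixed-precision settings once, ahead of launching:

(umi)$ accelerate config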

🚧 More Detailed Documentation Coming Soon! 🚧
