This repo provides a lightweight environment definition for the Franka Panda robot. It can be used to collect kinesthetic demonstrations, replay action trajectories on the robot, and deploy visual imitation policies for testing. It is built on Polymetis and the RealSense camera stack, and was used for the RB2 benchmarking project.
Modifications (suddhu):
- `playback.py` publishes EE pose, rgb, and depth as ROS topics under `/franka/*` (a publisher sketch is given below)
- Modified `camera.py` resolution
- `--time` argument in `record.py`
- Added `environment.yml` for conda
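To give a rough idea of what the modified `playback.py` exposes, the sketch below publishes an end-effector pose plus RGB/depth frames on `/franka/*` topics. The exact topic names, message types, frame id, and use of `cv_bridge` are assumptions for illustration; check `playback.py` for the actual definitions.

```python
# Minimal sketch of publishing EE pose + RGB/depth over ROS (names are illustrative).
import numpy as np
import rospy
from geometry_msgs.msg import PoseStamped
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

rospy.init_node("franka_playback_publisher")
pose_pub = rospy.Publisher("/franka/ee_pose", PoseStamped, queue_size=1)  # assumed topic name
rgb_pub = rospy.Publisher("/franka/rgb", Image, queue_size=1)             # assumed topic name
depth_pub = rospy.Publisher("/franka/depth", Image, queue_size=1)         # assumed topic name
bridge = CvBridge()

rate = rospy.Rate(30)  # publish at roughly camera rate
while not rospy.is_shutdown():
    # Placeholders: in playback.py these would come from Polymetis and the RealSense.
    position, quaternion = np.zeros(3), np.array([0.0, 0.0, 0.0, 1.0])
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)
    depth = np.zeros((480, 640), dtype=np.uint16)

    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "panda_link0"  # assumed base frame
    msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = position
    (msg.pose.orientation.x, msg.pose.orientation.y,
     msg.pose.orientation.z, msg.pose.orientation.w) = quaternion

    pose_pub.publish(msg)
    rgb_pub.publish(bridge.cv2_to_imgmsg(rgb, encoding="rgb8"))
    depth_pub.publish(bridge.cv2_to_imgmsg(depth, encoding="16UC1"))
    rate.sleep()
```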
Before running this code you must set up a Franka Panda and install Polymetis. Note that we assume access to both a server computer (with the real-time kernel patch) that runs the Polymetis server, and a GPU-enabled client computer that runs the `franka_control` code. For this modification, please also install ROS and `rospy`, which are used to publish the RealSense + pose topics during `playback.py`.
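To sanity-check the ROS install, a minimal subscriber like the one below can be run alongside `playback.py` to confirm the pose topic is being published. The topic name here is an assumption and should be changed to match whatever `playback.py` actually uses.

```python
# Quick check that the EE pose topic is coming through (topic name is assumed).
import rospy
from geometry_msgs.msg import PoseStamped

def on_pose(msg):
    p = msg.pose.position
    rospy.loginfo("EE position: (%.3f, %.3f, %.3f)", p.x, p.y, p.z)

rospy.init_node("franka_topic_check")
rospy.Subscriber("/franka/ee_pose", PoseStamped, on_pose)
rospy.spin()
```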
While this codebase is quite flexible, we imagined a three-stage pipeline for collecting demonstrations with it. These steps are outlined below:
- Use `record.py` to collect joint trajectories kinesthetically. For example, `python record.py task_prefix --task pour --time 60` will configure the robot to collect and save pouring demonstrations.
- Use `python playback.py path/to/pouring_0.npz` to ingest trajectories (from the previous stage) and produce expert demonstrations without the human in the video frame. This is crucial in order to collect data with real environment dynamics. In this modification, we play back a single recording in a loop while publishing the ROS topics. (A sketch for inspecting a saved trajectory file is given after this list.)
- Finally, use `test_policy.py` to evaluate trained policies. See the robot_baselines repo for baseline policy implementations.
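Recorded trajectories are saved as `.npz` archives. The snippet below is one minimal way to inspect a recording before replaying it; the array names inside the archive are not documented here, so treat the printed keys as the source of truth rather than the guesses in the comments.

```python
# Inspect a recorded demonstration before replaying it (key names depend on record.py).
import numpy as np

data = np.load("path/to/pouring_0.npz")
print("arrays in the archive:", data.files)
for key in data.files:
    arr = data[key]
    # Typically includes joint positions over time, e.g. a (T, 7) array for the Panda arm.
    print(f"{key}: shape={arr.shape}, dtype={arr.dtype}")
```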
Note that each of these scripts requires a Polymetis server to be running before the code will work. You can accomplish this by running `sh launch.sh` on the configured server machine.
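A quick way to confirm the server is reachable from the client machine is to connect with the Polymetis client API, assuming the standard `RobotInterface` entry point; replace the IP address with that of your server machine.

```python
# Sanity-check the connection to the Polymetis server from the client machine.
from polymetis import RobotInterface

robot = RobotInterface(ip_address="192.168.1.100")  # replace with your server's IP
print("joint positions:", robot.get_joint_positions())
print("EE pose:", robot.get_ee_pose())  # position and orientation (quaternion)
```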
If you find this useful, please cite:
@inproceedings{dasari2021rb2,
title={RB2: Robotic Manipulation Benchmarking with a Twist},
author={Sudeep Dasari and Jianren Wang and Joyce Hong and Shikhar Bahl and Yixin Lin and Austin Wang and Abitha Thankaraj and Karanbir Chahal and Berk Calli and Saurabh Gupta and David Held and Lerrel Pinto and Deepak Pathak and Vikash Kumar and Abhinav Gupta},
year={2021},
eprint={2203.08098},
archivePrefix={arXiv},
primaryClass={cs.RO},
booktitle={NeurIPS 2021 Datasets and Benchmarks Track}
}