# ScrewMimic: Bimanual Imitation from Human Videos with Screw Space Projection

Arpit Bahety, Priyanka Mandikal, Ben Abbatematteo, Roberto Martín-Martín

The University of Texas at Austin

Project Page | arXiv | Video

ScrewMimic aims to enable robots to learn bimanual manipulation behaviors from human video demonstrations and fine-tune them through interaction in the real world. Inspired by seminal work in psychology and biomechanics, we propose modeling the interaction between the two hands as a serial kinematic linkage — as a screw motion, in particular.
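Concretely, a screw action couples a rotation about a 3D axis with a translation along that axis. The snippet below is a minimal sketch of one possible parameterization and the rigid transform it induces; the field names and conventions are illustrative and may not match the ones used in the paper or codebase.

```python
# Illustrative sketch of a screw-action parameterization (not the exact
# convention used by ScrewMimic).
from dataclasses import dataclass
import numpy as np

@dataclass
class ScrewAction:
    axis: np.ndarray   # unit direction of the screw axis, shape (3,)
    point: np.ndarray  # a point the axis passes through, shape (3,)
    theta: float       # rotation about the axis [rad]
    d: float           # translation along the axis [m]

    def as_transform(self) -> np.ndarray:
        """4x4 homogeneous transform: rotate by theta about the axis through
        `point`, then translate by d along the axis."""
        w = self.axis / np.linalg.norm(self.axis)
        W = np.array([[0.0, -w[2], w[1]],
                      [w[2], 0.0, -w[0]],
                      [-w[1], w[0], 0.0]])
        # Rodrigues' formula for the rotation part
        R = np.eye(3) + np.sin(self.theta) * W + (1.0 - np.cos(self.theta)) * (W @ W)
        T = np.eye(4)
        T[:3, :3] = R
        # x -> R (x - point) + point + d * w
        T[:3, 3] = (np.eye(3) - R) @ self.point + self.d * w
        return T
```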

## Installation

You will need to set up FrankMocap (with the ego-centric model enabled). Refer to https://github.com/facebookresearch/frankmocap

```bash
git clone --single-branch --branch main https://github.com/UT-Austin-RobIn/ScrewMimic.git
cd ScrewMimic
conda create --name screwmimic python=3.10
conda activate screwmimic
pip install -r requirements.txt
```

## Usage

The current codebase contains scripts to extract screw axes from a human RGBD video.

Run FrankMocap on the human video

```bash
python -m demo.demo_handmocap --input_path {path_to_rgb_img_folder} --out_dir ./mocap_output/{folder_name}/ --view_type ego_centric --save_pred_pkl
```

Extract hand poses (remember to set the FrankMocap output directory in the code)

```bash
python perception/extract_hand_poses.py --folder_name data/open_bottle_1
```
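As a rough illustration of what this step involves: FrankMocap's 2D hand detections can be lifted to 3D camera-frame points using the aligned depth image and pinhole intrinsics. The function, pixel values, and intrinsics below are made up for illustration and are not taken from perception/extract_hand_poses.py.

```python
# Hypothetical example: backproject a detected wrist pixel (u, v) into 3D
# using the aligned depth value and pinhole camera intrinsics.
import numpy as np

def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Return the camera-frame 3D point for pixel (u, v) at depth depth_m."""
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Made-up intrinsics for a 640x480 RGBD stream
wrist_xyz = backproject(u=312, v=244, depth_m=0.85,
                        fx=615.0, fy=615.0, cx=320.0, cy=240.0)
```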

Obtain the screw axis

```bash
python perception/extract_screw_action.py --folder_name data/open_bottle_1 --hand left
```
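For intuition, one standard way to recover a screw axis from a relative rigid transform (for example, the pose of one hand expressed in the frame of the other, before and after the motion) is the SE(3) matrix logarithm. The sketch below follows the Lynch & Park (Modern Robotics) formulation and is illustrative only; it is not the implementation in perception/extract_screw_action.py.

```python
# Illustrative only: recover (axis direction, point on axis, pitch, angle)
# from a relative rigid transform (R, p) via the SE(3) matrix log.
# Assumes the rotation angle is not close to 0 or pi.
import numpy as np

def screw_from_transform(R, p):
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # Rotation axis from the skew-symmetric part of R
    w = (1.0 / (2.0 * np.sin(theta))) * np.array([R[2, 1] - R[1, 2],
                                                  R[0, 2] - R[2, 0],
                                                  R[1, 0] - R[0, 1]])
    W = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])
    # Translational twist component: p = G(theta) v  =>  v = G^{-1}(theta) p
    G_inv = (np.eye(3) / theta
             - 0.5 * W
             + (1.0 / theta - 0.5 / np.tan(theta / 2.0)) * (W @ W))
    v = G_inv @ p
    q = np.cross(w, v)        # point on the axis closest to the origin
    pitch = float(w @ v)      # translation along the axis per radian of rotation
    return w, q, pitch, theta
```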
