This is a repository for GraspXL, which can generate objective-driven grasping motions for 500k+ objects with different dexterous hands.

GraspXL: Generating Grasping Motions for Diverse Objects at Scale

Contents

  1. News
  2. Dataset
  3. Code
  4. Installation
  5. Demo
  6. Citation
  7. License

News

[2024.10] We released the script in URDF_gen_from_obj to pre-process new objects. Put the .obj files you want to grasp (make sure they have sizes that are meaningful for grasping) under URDF_gen_from_obj/temp and run urdf_gen.py. It will generate a folder in rsc with the processed objects and their URDF files, which you can then use to generate grasping motions with any of the environment scripts.
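A minimal sketch of this workflow (the .obj filename is a hypothetical placeholder, and it is assumed that urdf_gen.py is run from inside URDF_gen_from_obj):

$ cp /path/to/my_object.obj URDF_gen_from_obj/temp/
$ cd URDF_gen_from_obj
$ python urdf_gen.py
$ ls ../rsc   # the processed objects and their URDF files should appear here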

[2024.08] Data example & viewer released!

[2024.08] Code released!

[2024.07] The large-scale generated motions for 500k+ objects, each with diverse objectives and currently with MANO and Allegro hand models, are ready to download! If you are interested, just fill out this form to get access!

We will continuously enrich the dataset (e.g., motions generated with more hand models, more grasping motions generated with different objectives, etc.) and keep you updated!

[2024.03] The code will be released soon. Please fill out this form if you want to be notified of any updates!

Dataset

The dataset has been released, including the grasping motion sequences of different robot hands for 500k+ objects. Check docs/DATASET.md for details and instructions.

For a quick trial of the dataset, we provide some examples (30 objects) of the data in the dataset_example subfolder.

We also provide a viewer for the grasping motions. Check GraspXL_visualization for more details.

For textures: We use decimated, texture-free Objaverse meshes in our dataset to reduce storage. However, the original Objaverse object ids are still included in the dataset (<object_id>), so you can download the original textured Objaverse objects following their official download tutorial. The only thing to note is that the meshes in our dataset are scaled from the original meshes, so you should compute the scaling factor of each object from the bounding box sizes and scale the downloaded original Objaverse mesh accordingly while keeping the textures. This is straightforward to do with a Python script using trimesh and/or pymeshlab. After this, you can replace the objects in the dataset with the textured meshes for visualization.
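A minimal sketch of this rescaling step using trimesh (the file paths are hypothetical placeholders, and a uniform scale factor is assumed):

import numpy as np
import trimesh

# Load the scaled, decimated mesh from the dataset and the original textured Objaverse mesh.
dataset_mesh = trimesh.load("dataset/<object_id>/object.obj", force="mesh")   # hypothetical path
textured_mesh = trimesh.load("objaverse/<object_id>.glb", force="mesh")       # hypothetical path

# Estimate the uniform scale factor from the ratio of the bounding-box extents.
scale = float(np.mean(dataset_mesh.bounding_box.extents / textured_mesh.bounding_box.extents))

# Apply it to the textured mesh; exporting to .glb keeps the texture with the geometry.
textured_mesh.apply_scale(scale)
textured_mesh.export("textured_scaled/<object_id>.glb")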

Note: The MANO hand poses in our dataset align with the original MANO model. Note that manopth and manotorch have different joint orders; for more details, check manotorch.

Code

The repository comes with all the features of the RaiSim physics simulation, as GraspXL is integrated into RaiSim.

The GraspXL-related code can be found in the raisimGymTorch subfolder. There are 12 environments (see envs) for the Allegro Hand ("allegro_"), MANO Hand ("ours_"), and Shadow Hand ("shadow_"), 4 for each hand. "_fixed" and "_floating" denote the environments for the first and second training phase, respectively. "_test" denotes the test environments, which contain different test scripts for the different test sets (PartNet, ShapeNet, Objaverse, and generated/reconstructed objects). "_demo" denotes the visualization environments, which also record the generated motions.

Installation

As good practice for Python package management, we recommend using a virtual environment (e.g., virtualenv or conda) so that packages from different projects do not interfere with each other. The code is tested with Python 3.8.10.
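For example, with conda (the environment name graspxl is an arbitrary choice):

$ conda create -n graspxl python=3.8.10
$ conda activate graspxl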

RaiSim setup

GraspXL is based on the RaiSim simulation. For the installation of RaiSim, see and follow our documentation under docs/INSTALLATION.md. Note that you need to get a valid, free license for the RaiSim physics simulation and an activation key via this link.

GraspXL setup

After setting up RaiSim, the last part is to set up the GraspXL environments.

$ cd raisimGymTorch 
$ python setup.py develop

All the environments are run from this raisimGymTorch folder.

Note that every time you change environment.hpp, you need to run python setup.py develop again to rebuild the environments.

Then install PyTorch (check your CUDA version and make sure it matches the index URL below):

$ pip3 install torch==2.3.0 torchvision==0.18.0 torchaudio==2.3.0 --index-url https://download.pytorch.org/whl/cu118

Install required packages

$ pip install scikit-learn scipy matplotlib

Other optional requirements

  1. (Only for MANO policy training) GraspXL uses the manotorch Anatomy Loss during training (for the MANO hand only), so if you want to train MANO hand policies (run ours_fixed/runner.py or ours_floating/runner.py), you need to install manotorch. Please follow the official guideline in manotorch. After installation, set mano_assets_root in mano_amano.py to your own path (see the sketch after this list).
  2. (Only for the ShapeNet test set) If you want to generate motions for the objects from the ShapeNet test set, download ShapeNet.zip, unzip it, and put the folder named large_scale_obj in rsc (the original object meshes are from ShapeNet).
  3. (Only for the 500k+ Objaverse test set) If you want to generate motions for the 500k+ Objaverse objects, fill out this form to get access to objaverse_urdf.zip. Unzip it and put the subset you want in rsc.
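A one-line sketch of the mano_assets_root edit mentioned in item 1 (the path is a hypothetical placeholder):

# In mano_amano.py: point mano_assets_root to your local MANO/manotorch assets directory.
mano_assets_root = "/path/to/mano_assets"   # hypothetical path; adjust to your setup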

You should be all set now. Try to run the demo!

Demo

We provide some pre-trained models to view the output of our method. They are stored in this folder.

  • For interactive visualizations, you need to run

    ./../raisimUnity/linux/raisimUnity.x86_64

    and check the Auto-connect option.

  • To randomly choose an object and visualize the generated sequences in simulation (use Mano Hand as an example), run

    python raisimGymTorch/env/envs/ours_demo/demo.py

You can specify the objects or the objectives of the generated motions in the visualization environments:

  • The object is by default a random object from the training set, which you can change to a specific object. You can specify the object set via the variable cat_name (e.g., for ours_demo), and choose a specific object via the variable obj_list (e.g., for ours_demo).

    The object sets include mixed_train (the training set from PartNet), affordance_level (the PartNet test set), large_scale_obj (the ShapeNet test set, which you can download with ShapeNet.zip), YCB (reconstructed YCB objects), gt (ground truth of the reconstructed YCB objects), wild (reconstructed in-the-wild objects), and gen (objects generated with DreamFusion).

  • The objectives are by default randomly sampled with the function get_initial_pose. You can also specify a desired objective with the function get_initial_pose_set; ours_demo shows an example (see the illustrative sketch below).
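A purely illustrative sketch of these settings (cat_name, obj_list, get_initial_pose, and get_initial_pose_set are the names mentioned above; the exact values, arguments, and their location in ours_demo/demo.py may differ):

# Illustrative only: choose the object set and optionally pin specific objects.
cat_name = "mixed_train"      # e.g., mixed_train, affordance_level, large_scale_obj, YCB, gt, wild, gen
obj_list = ["<object_id>"]    # a specific object from that set (hypothetical id placeholder)
# Objectives are sampled randomly via get_initial_pose by default; switch to
# get_initial_pose_set to fix a desired objective (see ours_demo for a concrete example).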

BibTeX Citation

To cite us, please use the following:

@inProceedings{zhang2024graspxl,
  title={{GraspXL}: Generating Grasping Motions for Diverse Objects at Scale},
  author={Zhang, Hui and Christen, Sammy and Fan, Zicong and Hilliges, Otmar and Song, Jie},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2024}
}

License

This work and the dataset are licensed under CC BY-NC 4.0.
