[RA-L + IROS2024] Learning to place unseen objects stably using large-scale simulation


Unseen Object Placement (UOP)

Sangjun Noh*   Raeyoung Kang*   Taewon Kim*   Seunghyeok Back   Seongho Bak   Kyoobin Lee

GIST AILAB
* These authors contributed equally to the paper   † Corresponding author

           

This repository contains the official implementation of the following paper:

Learning to Place Unseen Objects Stably using a Large-scale Simulation (IEEE Robotics and Automation Letters, RA-L + IROS 2024)

Object placement is a fundamental task for robots, yet it remains challenging for partially observed objects. Existing methods for object placement have limitations, such as the requirement for a complete 3D model of the object or the inability to handle complex shapes and novel objects, which restrict the applicability of robots in the real world. Herein, we focus on addressing the Unseen Object Placement (UOP) problem. We tackled the UOP problem using two methods: (1) UOP-Sim, a large-scale dataset to accommodate various shapes and novel objects, and (2) UOP-Net, a point cloud segmentation-based approach that directly detects the most stable plane from partial point clouds. Our UOP approach enables robots to place objects stably, even when the object's shape and properties are not fully known, thus providing a promising solution for object placement in various environments. We verify our approach through simulation and real-world robot experiments, demonstrating state-of-the-art performance for placing single-view and partial objects.




Environment Setting


A quick view of our overall pipeline



Download Data

Evaluation Data

  • The UOP-Sim evaluation set contains 63 YCB objects, each with 100 partially sampled point clouds. This set was used for testing and evaluation.
  • You can run the inference and evaluation code after downloading this data.
  • The UOP-Sim evaluation data can be downloaded from this Google Drive link or by running the 0.download_uop_sim_dataset.sh command.
sh ./example/0.download_uop_sim_dataset.sh
# output : uop_data_for_evaluation.zip 
Evaluation Data File tree
└── uop_data
    └── ycb
        ├── 002_master_chef_can
    │   ├── inspected_zaxis.pkl   # UOP-Sim label (axes for stable placement)
        │   ├── mesh_watertight.ply   # watertight mesh
        │   ├── model.ttm             # scene model to evaluate in simulation
        │   └── partial
    │       ├── 0.pkl             # partially sampled point cloud
        │       ├── 1.pkl
        │       ├── ...
        │       └── 99.pkl
        ├── 003_cracker_box
        ├── ...
        └── 077_rubiks_cube
127 directories, 6489 files, 171.1MB
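Each partial/*.pkl file holds one partially observed point cloud. The exact on-disk schema is not documented here, so the snippet below is only a sketch: it assumes a pickled dict with a "points" array (an assumption, not the repository's loader) and shows a typical center-and-scale normalization applied before feeding a point cloud network.

```python
import pickle
import numpy as np

# Stand-in for a UOP-Sim partial file such as
# uop_data/ycb/002_master_chef_can/partial/0.pkl (schema is assumed).
points = np.random.rand(1024, 3).astype(np.float32)  # XYZ partial point cloud
with open("partial_sample.pkl", "wb") as f:
    pickle.dump({"points": points}, f)

with open("partial_sample.pkl", "rb") as f:
    data = pickle.load(f)

partial = np.asarray(data["points"])
# Center at the origin and scale so the farthest point has unit norm,
# a common normalization before a point cloud segmentation network.
partial = partial - partial.mean(axis=0)
partial = partial / np.linalg.norm(partial, axis=1).max()
```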

Whole Data

You can download the whole UOP-Sim dataset here



Generate Data (optional)

If you want to generate the UOP-Sim data yourself, please follow the instructions in setups/data_generation.md



Inference and Evaluate

To place objects with the placement modules (UOP (ours), RPF, CHSA, BBF),

follow the step-by-step instructions in setups/uopnet.md,

or run the combined script below.

sh ./partial_evaluate.sh <path/to/uop_data> <process_num>   # ex. sh ./partial_evaluate.sh ~/uop_data 16

Result

After all processes finish, you will get the table below.

All metrics measure the object's movement, after it is placed on the table (in simulation), until it comes to rest.

-----------------------------------------------------------------
Module           | UOP   | RPF   | CHSA  | BBF   
rotation(deg)    | 3.98  | 26.13 | 39.28 | 45.05 
translation(cm)  | 0.39  | 3.24  | 5.78  | 6.44  
l2norm           | 0.11  | 0.67  | 1.01  | 1.15  
Success(<10deg)  | 73.32 | 62.37 | 42.06 | 30.03 
Success(/infer)  | 90.08 | 62.37 | 42.06 | 30.03 
-----------------------------------------------------------------
  • rotation : rotation change of the object (degrees)
  • translation : translation of the object (cm)
  • l2norm : L2 norm of the transform matrix difference
  • Success(<10deg) : success rate over all trials; a placement counts as a success if the rotation error is lower than 10 degrees.
  • Success(/infer) : success rate over inferenced trials only, with the same 10-degree threshold.
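As a sketch of how such metrics can be computed, the snippet below takes 4x4 homogeneous transforms of the object before and after it settles and returns the rotation error (geodesic angle, degrees), translation (cm), and the L2 norm of the transform difference. This mirrors the metric definitions above but is not the repository's exact evaluation code.

```python
import numpy as np

def rotation_error_deg(R_before, R_after):
    # Geodesic angle between two rotation matrices.
    cos = (np.trace(R_before.T @ R_after) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def placement_metrics(T_before, T_after):
    rot = rotation_error_deg(T_before[:3, :3], T_after[:3, :3])
    trans = np.linalg.norm(T_after[:3, 3] - T_before[:3, 3]) * 100.0  # m -> cm
    l2 = np.linalg.norm(T_after - T_before)  # Frobenius norm of the difference
    return rot, trans, l2

# A perfectly stable placement: the object does not move after release.
T = np.eye(4)
rot, trans, l2 = placement_metrics(T, T)
```

A placement would then count toward Success(<10deg) whenever `rot < 10.0`.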

Visualize inference results

You can visualize the inference result of each module with matplotlib

python example/visualize_inference_result.py --exp_file path/to/inference_result/endwith.pkl --module uop
  • --exp_file : pkl file containing a module's inference result; after inference, results are saved in each object directory (ex. ~/uop_data/ycb/002_master_chef_can/partial_eval/uop/0.pkl)
  • --module : name of the placement module
    • uop : UOP-Net (ours)
    • trimesh : Convex Hull Stability Analysis (CHSA)
    • primitive : Bounding Box Fitting (BBF)
    • ransac : RANSAC Plane Fitting (RPF)
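For intuition on the RPF baseline, the core idea (fit the dominant plane by RANSAC, i.e. repeatedly sample three points and keep the plane with the most inliers) can be sketched in a few lines of NumPy. This is an illustrative toy on synthetic data, not the repository's ransac module.

```python
import numpy as np

def ransac_plane(points, n_iter=200, thresh=0.01, seed=0):
    """Toy RANSAC plane fit: returns (normal, point_on_plane), inlier_count."""
    rng = np.random.default_rng(seed)
    best, best_inliers = None, 0
    for _ in range(n_iter):
        i = rng.choice(len(points), 3, replace=False)
        p0, p1, p2 = points[i]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (nearly collinear) sample
            continue
        n = n / norm
        dist = np.abs((points - p0) @ n)      # point-to-plane distances
        inliers = int((dist < thresh).sum())
        if inliers > best_inliers:
            best, best_inliers = (n, p0), inliers
    return best, best_inliers

# Synthetic cloud: a noisy z=0 plane plus a few outliers.
rng = np.random.default_rng(1)
plane = np.column_stack([rng.uniform(-1, 1, (500, 2)),
                         rng.normal(0, 0.002, 500)])
pts = np.vstack([plane, rng.uniform(-1, 1, (20, 3))])
(normal, _), n_in = ransac_plane(pts)
```

On this input the recovered normal is close to the z-axis and almost all plane points are inliers.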


Training (optional)

We provide pretrained model weights in our repository.

If you want to train the model yourself, please follow the instructions here



Inference Results

  • These are inference images from UOP-Net and the other methods. Partial points are observed as shown in Partial View Generation below.

Sample Inference Result Visualization

(figure: Object | Whole Points | Stable Label | Partial Points, with inference results: Partial Points | UOP-Net | RPF | CHSA | BBF)


Partial View Generation

  • This is a visualization of our partial point generation sequence (GIF) and the inference results for each partial point cloud.

Partial View Points & Inference Result

(figure: Partial | UOP-Net | RPF | CHSA | BBF)




Citation

@article{noh2023learning,
  title={Learning to Place Unseen Objects Stably using a Large-scale Simulation}, 
  author={Noh, Sangjun and Kang, Raeyoung and Kim, Taewon and Back, Seunghyeok and Bak, Seongho and Lee, Kyoobin},
  journal={IEEE Robotics and Automation Letters}, 
  year={2024},
  volume={},
  number={},
  pages={1-8},
}

License

See LICENSE
