blender-hoisynth

This repository is part of the supplementary materials for the ICIP 2024 submission titled "VR-based generation of photorealistic synthetic data for training hand-object tracking models". It provides several helpful scripts for rendering high-quality synthetic sequences of Hand-Object Interactions performed in VR using Blender. It also provides scripts for exporting data in the DexYCB format.

Paper on ArXiv

The original Blender files are currently provided only to ICIP reviewers for the peer-review process. These materials will be released if the paper is accepted for publication.

Please appropriately reference the following paper in any publication making use of the Software. Citation: [...]

Qualitative comparison of sample blender-hoisynth images (left) with real DexYCB images (right).



Sample HOI videos generated using blender-hoisynth.

Requirements

Python <= 3.10

Installation

Clone this repository and install requirements

git clone git@github.com:wetoo-cando/blender-hoisynth.git
cd blender-hoisynth
pip install -r requirements.txt

Install UPBGE 0.30

Replace VR config file

mv scripts/defaults.py /path/to/UPBGE/blender/scripts/addons/viewport/vr_preview/configs/defaults.py

Clone BlenderProc at commit 31ebb06c2ea2581da25f1f3e4f9544c4b0cad8a4 into this repository

cd blender-hoisynth
git clone git@github.com:DLR-RM/BlenderProc.git
cd BlenderProc
git reset --hard 31ebb06c2ea2581da25f1f3e4f9544c4b0cad8a4

Add rendering property files to BlenderProc

cd ..
mv scripts/write_dexycb_data2.py BlenderProc/blenderproc/python/writer/write_dexycb_data2.py
mv scripts/BopWriterUtility.py BlenderProc/blenderproc/python/writer/BopWriterUtility.py
mv scripts/BlendLoader.py BlenderProc/blenderproc/python/loader/BlendLoader.py

Add code

from blenderproc.python.writer.write_dexycb_data2 import write_dexycb_data2

to the end of BlenderProc/blenderproc/api/writer/__init__.py
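
The tail of that __init__.py would then look roughly like this (the pre-existing writer imports are shown only for context and may differ at the pinned BlenderProc commit):

# BlenderProc/blenderproc/api/writer/__init__.py (end of file)
# Existing writer imports (exact set depends on the BlenderProc commit):
from blenderproc.python.writer.BopWriterUtility import write_bop
from blenderproc.python.writer.CocoWriterUtility import write_coco_annotations
# Added for blender-hoisynth: exposes the DexYCB writer as bproc.writer.write_dexycb_data2
from blenderproc.python.writer.write_dexycb_data2 import write_dexycb_data2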

Move the rendering folder to BlenderProc

mv rendering BlenderProc/examples

Add transforms3d to install_requires in BlenderProc/setup.py.

Add "transforms3d==0.4.1" to the end of BlenderProc/blenderproc/python/utility/DefaultConfig.py.
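
In code, the two edits would look roughly as follows, assuming DefaultConfig.py keeps its pip packages in a default_pip_packages list as in recent BlenderProc versions; the existing entries (elided here) stay unchanged:

# BlenderProc/setup.py -- add transforms3d to the install requirements
install_requires=[
    # ... existing BlenderProc requirements stay unchanged ...
    "transforms3d",
]

# BlenderProc/blenderproc/python/utility/DefaultConfig.py -- append the pinned version
# so blenderproc installs it into Blender's bundled Python environment
default_pip_packages = [
    # ... existing default packages stay unchanged ...
    "transforms3d==0.4.1",
]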

Install blenderproc

cd BlenderProc
pip install -e .
python setup.py install
blenderproc pip install coloredlogs

Clone manopth to BlenderProc/examples/rendering

Move scripts/generate_hand_pose.py to the manopth folder

mv scripts/generate_hand_pose.py BlenderProc/examples/rendering/manopth

Usage

Download the blendfiles for the demo and the blender-hoisynth software. For ICIP 2024 reviewers, access details for the blendfiles are provided in the supplementary materials. These materials will be released publicly if the paper is accepted for publication.

Recording

Recording should be done on Windows.

The objects' initial poses can be extracted from DexYCB by running

python examples/rendering/generate_initial_poses.py --DexYCB_dir /path/to/DexYCB --output_dir /path/to/output/dir
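
For orientation, here is a minimal sketch of the DexYCB data this step reads, assuming the standard DexYCB layout (a per-sequence meta.yml listing the YCB objects, plus per-frame labels_XXXXXX.npz files whose pose_y array holds one 3x4 [R|t] pose per object). The paths are hypothetical, and this only illustrates the dataset format, not the actual script:

import numpy as np
import yaml

# Hypothetical example paths; point these at your DexYCB copy.
sequence_dir = "/path/to/DexYCB/20200709-subject-01/20200709_141754"
camera_dir = sequence_dir + "/836212060125"  # one of the camera serial subfolders

# meta.yml lists which YCB objects appear in the sequence.
with open(sequence_dir + "/meta.yml") as f:
    meta = yaml.safe_load(f)
print("YCB object ids:", meta["ycb_ids"])

# The first frame's labels hold the objects' initial poses.
labels = np.load(camera_dir + "/labels_000000.npz")
initial_poses = labels["pose_y"]  # (num_objects, 3, 4) [R|t] matrices
print("Initial pose of first object:\n", initial_poses[0])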

Open blender-hoisynth-v28 with UPBGE and record with the VR headset.

Press P to enter play mode, R to record, Space to save the recorded animation, and D to drop the recorded animation.

Rendering

A demo for running the rendering is provided in the blendfiles.

Download the assets folder into blender-hoisynth for background-object generation

Run

cd BlenderProc
blenderproc run examples/rendering/render_animation.py --blend_dir /path/to/blend/files --assets_dir /path/to/assets/folder --pose_dir /path/to/object/initial/position/folder --output_folder /path/to/output/folder --Subject_id <subject_id> --hand_armature <hand_armature_name>

MANO parameter generation

Download the calibration folder from DexYCB and place it in blender-hoisynth/assets/calibration

Then run

python examples/rendering/manopth/generate_hand_pose.py --callibration_dir /path/to/calibration/folder --pose_dir /path/to/object/initial/position/folder --output_dir /path/to/render/results --mano_dir /path/to/mano/shape/parameter/folder --Subject_id <subject_id>
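
For background, generate_hand_pose.py is placed inside manopth, which provides the differentiable MANO hand layer. A minimal sketch of how MANO pose/shape parameters map to hand vertices and joints with manopth (assumes torch and the MANO model files under mano/models; this illustrates the manopth API, not the script's actual logic):

import torch
from manopth.manolayer import ManoLayer

# Assumes the MANO pickle files (e.g. MANO_RIGHT.pkl) are available under mano/models.
mano_layer = ManoLayer(mano_root="mano/models", side="right",
                       use_pca=True, ncomps=45, flat_hand_mean=False)

batch_size = 1
pose = torch.zeros(batch_size, 45 + 3)  # 3 global rotation + 45 PCA pose components
shape = torch.zeros(batch_size, 10)     # 10 MANO shape (beta) parameters

# MANO parameters -> 778 mesh vertices and 21 hand joints (millimetres by default).
verts, joints = mano_layer(pose, shape)
print(verts.shape, joints.shape)        # torch.Size([1, 778, 3]) torch.Size([1, 21, 3])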
