An integrated platform for generating realistic 3D human motion from text descriptions, combining Tencent's HY-Motion 1.0 text-to-motion system with the SMPL-X body model and Meta's MHR (Momentum Human Rig). The output is compatible with the Meta Movement SDK high-fidelity full-body avatar, making it ready for XR applications.
- Text-to-Motion Generation: Convert natural language descriptions into 3D human motion sequences using HY-Motion 1.0
- High-Fidelity Body Model: Meta MHR with identity (45 params), pose (204 params), and facial expression (72 params) control
- Format Conversion: Convert between SMPL, SMPLX, and MHR body model formats
- Multiple LOD Support: Generate meshes at 7 different levels of detail (LOD 0-6)
- Web Interface: Interactive Gradio UI for motion generation and visualization
- 3D Visualization: ScenePic-based interactive HTML visualizations
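The MHR parameter split above (identity, pose, expression) can be sketched as plain vectors; the sizes come from the feature list, while the dict layout below is only an illustration, not the actual MHR API:

```python
import numpy as np

# Parameter sizes from the MHR feature list above.
MHR_PARAM_SIZES = {
    "identity": 45,     # body shape / identity coefficients
    "pose": 204,        # skeletal pose parameters
    "expression": 72,   # facial expression coefficients
}

def make_neutral_params() -> dict:
    """Zero-initialized (neutral) MHR parameter vectors."""
    return {name: np.zeros(size, dtype=np.float32)
            for name, size in MHR_PARAM_SIZES.items()}

params = make_neutral_params()
print(sum(v.size for v in params.values()))  # 321 parameters in total
```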
| Template | Preview | Description |
|---|---|---|
| Wooden Model | ![]() | Original wooden model from HY-Motion-1.0 |
| SMPLX | ![]() | SMPLX body model |
| Meta Movement SDK | ![]() | High-fidelity rig from Meta Movement SDK |
```
HMC/
├── app.py                    # Main Gradio web interface
├── conf.py                   # Output path configuration
├── pixi.toml                 # Pixi package manager config
├── HY-Motion-1.0/            # Text-to-Motion generation (submodule)
│   ├── hymotion/             # Core motion generation library
│   ├── ckpts/                # Model checkpoints
│   └── examples/             # Example prompts
├── MHR/                      # Momentum Human Rig (submodule)
│   ├── mhr/                  # Core MHR package
│   └── tools/                # Conversion & visualization tools
├── scripts/                  # Conversion and visualization scripts
│   ├── simple_convert.py     # SMPLX/FBX to MHR conversion
│   └── visualize_output.py   # 3D mesh visualization
├── assets/                   # Model assets and data files
│   ├── SMPLX_NEUTRAL.npz     # SMPLX body model
│   ├── mhr_model.pt          # MHR TorchScript model
│   ├── lod*.fbx              # FBX rigged models (LOD 0-6)
│   └── *mapping.npz          # Joint mapping files
└── output/                   # Generated outputs (created at runtime)
```
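`conf.py` centralizes the output paths used by the scripts. The sketch below shows that pattern only; the function name and layout are assumptions, not the actual module contents:

```python
from pathlib import Path

# Root for all generated artifacts; subfolders are created on demand.
OUTPUT_ROOT = Path("output")

def get_output_dir(name: str, root: Path = OUTPUT_ROOT) -> Path:
    """Return a named subdirectory under the output root, creating it if needed."""
    d = root / name
    d.mkdir(parents=True, exist_ok=True)
    return d
```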
- Python 3.12+
- CUDA 13.0+
- PyTorch 2.8.0+
This project was developed on Ubuntu 24.04 with a single RTX 5090 GPU.
```shell
# Clone with submodules
git clone --recursive https://github.com/your-repo/HMC.git
cd HMC

# Download and unzip the model assets
curl -OL https://github.com/luffy-yu/HMC/releases/download/v1.0.0/assets.zip
unzip assets.zip -d assets/

# Install dependencies
pixi install

# Activate environment
pixi shell
```

Download the model checkpoints from the HY-Motion-1.0 ckpts README and place them in the following directories:
| Model | Path |
|---|---|
| HY-Motion | ./HY-Motion-1.0/ckpts/tencent/HY-Motion-1.0/HY-Motion-1.0-Lite |
| Prompt Engineering | ./HY-Motion-1.0/ckpts/Text2MotionPrompter |
`Text2MotionPrompter` is not needed if prompt engineering is disabled.
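A quick way to confirm the checkpoints are in place before launching; the paths mirror the table above, while the helper itself is just a convenience sketch:

```python
from pathlib import Path

# Checkpoint locations from the table above.
REQUIRED_CKPTS = [
    Path("HY-Motion-1.0/ckpts/tencent/HY-Motion-1.0/HY-Motion-1.0-Lite"),
    # Optional when --disable-prompt-engineering is used:
    Path("HY-Motion-1.0/ckpts/Text2MotionPrompter"),
]

def missing_checkpoints(root: Path = Path(".")) -> list:
    """Return the checkpoint paths that are not present under `root`."""
    return [p for p in REQUIRED_CKPTS if not (root / p).is_dir()]

if __name__ == "__main__":
    for p in missing_checkpoints():
        print(f"missing: {p}")
```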
Launch the Gradio web interface:

```shell
python app.py

# Disable prompt engineering (required even on an RTX 5090)
python app.py --disable-prompt-engineering
```

Access the interface at http://localhost:7860.
Command-line options:
| Option | Description | Default |
|---|---|---|
| `--host` | Server host address | `localhost` |
| `--port` | Server port number | `7860` |
| `--disable-prompt-engineering` | Disable text rewriting and duration estimation | `False` |
`--disable-prompt-engineering` is needed even on an RTX 5090 GPU.
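The options above map onto a standard argparse setup; this is a sketch of the equivalent parser, not the actual code in `app.py`:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """CLI options mirroring the table above."""
    p = argparse.ArgumentParser(description="HMC Gradio web interface")
    p.add_argument("--host", default="localhost", help="Server host address")
    p.add_argument("--port", type=int, default=7860, help="Server port number")
    p.add_argument("--disable-prompt-engineering", action="store_true",
                   help="Disable text rewriting and duration estimation")
    return p

args = build_parser().parse_args(["--disable-prompt-engineering"])
print(args.host, args.port, args.disable_prompt_engineering)  # localhost 7860 True
```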
Examples:

```shell
python app.py --host 0.0.0.0 --port 7860     # Run on all interfaces, port 7860
python app.py --disable-prompt-engineering   # Disable prompt engineering
python app.py --help                         # Show all options
```

Generate motion from text:

```shell
cd HY-Motion-1.0
python local_infer.py --prompt "A person walks forward slowly"
```

Convert SMPLX to MHR:
```shell
# Default template (no FBX output)
python scripts/simple_convert.py --smplx assets/SMPLX_NEUTRAL.npz -i assets/tpose_poses.npy -o output/default_template

# SMPLX template (copy animation from --smplx onto lod1.fbx and write walking.fbx to output/smplx_template)
python scripts/simple_convert.py --smplx assets/walking.fbx --fbx-template assets/lod1.fbx -o output/smplx_template

# Meta Movement SDK avatar template (copy animation from --smplx onto high_fidelity_rig.fbx and write walking.fbx to output/hifi_template)
python scripts/simple_convert.py --smplx assets/walking.fbx --fbx-template ./assets/high_fidelity_rig.fbx -o output/hifi_template
```

`high_fidelity_rig.fbx` comes from Meta Movement 72.0.0 (`com.meta.movement`, https://github.com/oculus-samples/Unity-Movement.git, `Assets\Samples\Meta Movement\72.0.0\Advanced Samples\HipPinning\Models\high_fidelity_rig.fbx`).
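The three conversions above can be scripted in one pass. The command construction mirrors the CLI flags shown; the actual run is skipped when the repo layout is absent:

```python
import subprocess
from pathlib import Path

# (extra CLI args, output directory) for each template, as in the examples above.
JOBS = [
    (["--smplx", "assets/SMPLX_NEUTRAL.npz", "-i", "assets/tpose_poses.npy"],
     "output/default_template"),
    (["--smplx", "assets/walking.fbx", "--fbx-template", "assets/lod1.fbx"],
     "output/smplx_template"),
    (["--smplx", "assets/walking.fbx", "--fbx-template", "assets/high_fidelity_rig.fbx"],
     "output/hifi_template"),
]

def convert_command(extra: list, out_dir: str) -> list:
    """Build one simple_convert.py invocation."""
    return ["python", "scripts/simple_convert.py", *extra, "-o", out_dir]

if __name__ == "__main__":
    if Path("scripts/simple_convert.py").is_file():
        for extra, out_dir in JOBS:
            subprocess.run(convert_command(extra, out_dir), check=True)
```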
Visualize output:

```shell
# Use the default output name
python scripts/visualize_output.py --output_dir output/default_template

# Customize the HTML file name
python scripts/visualize_output.py --output_dir output/smplx_template --html_name my_visualization.html

# Loop over all folders under output/
python scripts/visualize_output.py --output_dir output/
```

Output formats:

- Motion Sequences: `.npy`, `.fbx`
- 3D Meshes: `.ply`, `.fbx`
- Visualizations: interactive HTML files
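Given the formats listed above, generated files can be grouped by extension for a quick inventory (a small helper sketch, not part of the repo):

```python
from pathlib import Path

# Extensions produced by the pipeline, per the list above.
OUTPUT_EXTENSIONS = {".npy", ".fbx", ".ply", ".html"}

def collect_outputs(root: Path) -> dict:
    """Group files under `root` by extension, keeping only known output formats."""
    groups: dict = {}
    for f in sorted(root.rglob("*")):
        if f.is_file() and f.suffix in OUTPUT_EXTENSIONS:
            groups.setdefault(f.suffix, []).append(f)
    return groups
```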
- Tencent HY-Motion 1.0 - Text-to-Motion generation
- Meta MHR - Momentum Human Rig body model
- SMPL-X - Expressive body model
See individual submodule licenses for HY-Motion and MHR components.


