
Multi-Modal Whole-Body Control

License: BSD-3-Clause | Python 3.10 | OS: Linux | Isaac Sim 4.5 | Isaac Lab 2.1.1 | RSL-RL

Multi-Modal Whole-Body Control is a research-oriented framework for learning whole-body control policies for humanoid and general articulated robots from heterogeneous motion signals.
Built on NVIDIA Isaac Sim / Isaac Lab and RSL-RL, the framework targets large-scale parallel simulation and multi-modal imitation learning, with a focus on robust motion tracking and cross-modal embodiment transfer.

The core philosophy of this repository is to treat whole-body control as a multi-modal sequence alignment problem:
a robot policy learns to coordinate its full-body dynamics by jointly conditioning on robot-centric states and external motion descriptors such as human body pose (SMPL-X) and SE(3) keypoints.
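
To make this interface concrete, the sketch below shows the general shape of such multi-modal conditioning: proprioceptive features and external motion descriptors are concatenated into a single policy input. All tensor names and feature sizes here are illustrative assumptions, not the repository's actual observation layout.

import torch

# Hypothetical per-environment feature sizes (illustrative only).
num_envs = 4096
proprio = torch.randn(num_envs, 63)        # robot-centric state (joint pos/vel, base state)
smplx_pose = torch.randn(num_envs, 165)    # SMPL-X body pose for the reference frame
keypoints = torch.randn(num_envs, 24 * 3)  # flattened 3D keypoint targets

# The policy conditions jointly on robot state and external motion descriptors.
policy_obs = torch.cat([proprio, smplx_pose, keypoints], dim=-1)
print(policy_obs.shape)  # torch.Size([4096, 300])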

The framework currently focuses on Unitree G1, but is designed to be extensible to other humanoid or whole-body platforms.

Key Ideas

  • Whole-body control at scale
    Designed for thousands of parallel environments in Isaac Lab, enabling fast and stable on-policy learning.

  • Multi-modal motion supervision
    Policies can be conditioned on:

    • robot joint trajectories,
    • human SMPL-X full-body motion,
    • 3D / SE(3) keypoint trajectories.
  • Unified tracking & imitation
    Motion tracking, multi-motion training, and GAE-Mimic–style imitation are expressed within a single task framework.

Demo

See docs/illustrations/demo.mp4

Repository Structure

.
├── source/whole_body_control
│   ├── robots/              # Robot models and actuators (e.g., Unitree G1)
│   ├── tasks/               # Tracking / Multi-Tracking / GAEMimic tasks
│   ├── utils/               # Motion datasets, dataloaders, runners
│
├── scripts
│   ├── rsl_rl/
│   │   ├── train.py         # Training entry point
│   │   └── play.py          # Policy rollout (if available)
│   ├── tools/               # Env listing, random / zero agents
│   └── data/                # Dataset preprocessing utilities
│
├── third_party/rsl_rl       # Vendorized RSL-RL
├── docs/env_setup.md        # Detailed environment setup
└── README.md

Tasks and Environments

All environments are implemented using Isaac Lab task abstractions.

Motion Tracking

  • TrackingEnvCfg
    • Single reference motion tracking
    • Joint position control
    • Dense rewards on pose, velocity, orientation, contacts
    • Privileged observations for the critic (see the sketch below)
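
The actor/critic split above follows the asymmetric observation pattern that Isaac Lab's manager-based API expresses through observation groups. The following is a minimal sketch of that pattern, assuming manager-based task configs; the actual observation terms in TrackingEnvCfg are not reproduced here.

import isaaclab.envs.mdp as mdp
from isaaclab.managers import ObservationGroupCfg as ObsGroup
from isaaclab.managers import ObservationTermCfg as ObsTerm
from isaaclab.utils import configclass

@configclass
class ObservationsCfg:
    @configclass
    class PolicyCfg(ObsGroup):
        # proprioceptive terms visible to the actor
        joint_pos = ObsTerm(func=mdp.joint_pos_rel)
        joint_vel = ObsTerm(func=mdp.joint_vel_rel)

    @configclass
    class CriticCfg(ObsGroup):
        # privileged terms visible only to the critic
        base_lin_vel = ObsTerm(func=mdp.base_lin_vel)
        joint_pos = ObsTerm(func=mdp.joint_pos_rel)

    policy: PolicyCfg = PolicyCfg()
    critic: CriticCfg = CriticCfg()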

Multi-Motion Tracking

  • MultiTracking_TrackingEnvCfg
    • Samples from multiple motion clips
    • Dataset-driven command interface
    • Performance-based curriculum over motion difficulty (illustrated in the sketch below)
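
One way such a curriculum can work is to sample clips in proportion to how poorly the policy currently tracks them. The sketch below illustrates the idea only; it is not the repository's implementation, and clip_error is a placeholder for whatever per-clip statistic the runner maintains.

import torch

num_clips = 128
# Placeholder running per-clip tracking error, updated from episode statistics.
clip_error = torch.rand(num_clips)

def sample_clips(batch: int, temperature: float = 1.0) -> torch.Tensor:
    # Harder clips (higher error) receive higher sampling probability.
    probs = torch.softmax(clip_error / temperature, dim=0)
    return torch.multinomial(probs, batch, replacement=True)

clip_ids = sample_clips(batch=4096)  # one reference clip per parallel environment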

Multi-Modal Imitation (GAE-Mimic)

  • GAEMimic_TrackingEnvCfg
    • Conditions policy on:
      • robot joint trajectories,
      • SMPL-X human body pose,
      • SE(3) keypoint trajectories
    • Designed for cross-modal imitation and embodiment transfer

Robot Model

Unitree G1

Defined in whole_body_control/robots/g1.py using:

  • ArticulationCfg
  • ImplicitActuatorCfg

Includes:

  • Full-body joint limits and initial posture
  • Per-joint-group actuator stiffness and damping
  • Action scaling for stable RL control
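
As a rough picture of what such a configuration looks like, here is a minimal Isaac Lab-style sketch. The USD path, joint-name patterns, and gain values are placeholders for illustration; refer to whole_body_control/robots/g1.py for the actual values.

import isaaclab.sim as sim_utils
from isaaclab.actuators import ImplicitActuatorCfg
from isaaclab.assets import ArticulationCfg

G1_CFG = ArticulationCfg(
    spawn=sim_utils.UsdFileCfg(usd_path="path/to/g1.usd"),  # placeholder asset path
    init_state=ArticulationCfg.InitialStateCfg(
        pos=(0.0, 0.0, 0.74),               # illustrative standing height
        joint_pos={".*_knee_joint": 0.3},   # illustrative initial posture
    ),
    actuators={
        # Per-joint-group PD gains via the implicit (simulator-side) actuator model.
        "legs": ImplicitActuatorCfg(
            joint_names_expr=[".*_hip_.*", ".*_knee_joint", ".*_ankle_.*"],
            stiffness=100.0,
            damping=5.0,
        ),
    },
)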

Installation

# Install Isaac Lab
git clone https://github.com/isaac-sim/IsaacLab.git
cd IsaacLab
git checkout 90b79bb2d44feb8d833f260f2bf37da3487180ba
./isaaclab.sh -i

# (Optional) install the vendorized RSL-RL (shipped under third_party/rsl_rl)
./isaaclab.sh -p -m pip install -e path/to/rsl_rl

# Install this repository
cd source/whole_body_control
pip install -e .

Dataset Download

Preprocessed datasets (with SMPL-X and keypoints) are available at:

Unzip the archives under the datasets/ directory.

Quick Start

List Available Environments

python scripts/tools/list_envs.py

Train a Policy

python scripts/rsl_rl/train.py \
  --headless \
  --task MultiTracking-Flat-G1-v0

Common options:

  • --num_envs
  • --seed
  • --max_iterations
  • --video, --video_interval
  • --distributed (multi-GPU)
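
For example, a multi-motion training run combining these options (values illustrative) might look like:

python scripts/rsl_rl/train.py \
  --headless \
  --task MultiTracking-Flat-G1-v0 \
  --num_envs 4096 \
  --seed 42 \
  --max_iterations 20000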

License

  • Isaac Lab components: BSD-3-Clause
  • RSL-RL: see third_party/rsl_rl
  • Robot assets (e.g., Unitree G1): subject to original licenses
