TrackerLab is a cutting-edge modular framework for humanoid motion retargeting, trajectory tracking, and skill-level control, built on top of IsaacLab.
Whether you're working with SMPL/FBX motion data, designing low-level whole-body controllers, or building skill graphs for high-level motion planning — TrackerLab brings everything together with a clean, extensible manager-based design.
Built to track, compose, and control humanoid motions — seamlessly from dataset to deployment.
*Demo clips: G1 Debug | G1 Running | G1 Jump*
- 🧠 **IsaacLab-Integrated Motion Tracking**: Seamlessly plugs motion tracking into IsaacLab's simulation and control framework using manager-based abstraction.
- 🔁 **Full Motion Retargeting Pipeline**: Converts SMPL/AMASS/FBX human motions into robot-specific trajectories, with support for T-pose alignment, filtering, and interpolation.
- 🎮 **Versatile Command Control Modes**: Switch between multiple control paradigms, such as ex-body pose control and PHC, using the powerful CommandManager.
- 🔀 **Skill Graph via FSM Composition**: Design complex motion behaviors using FSM-based skill graphs; supports manual triggers, planners, or joystick interfaces (see the sketch after this list).
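To make the skill-graph idea concrete, here is a minimal, hypothetical sketch of FSM-style skill composition. The class, skill, and clip names are illustrative only and are not TrackerLab's actual API; in TrackerLab, graph nodes would reference retargeted motion clips tracked by the low-level controller.

```python
# Hypothetical sketch of an FSM-style skill graph (illustrative only, not
# TrackerLab's actual API): each node tracks a retargeted motion clip, and
# transitions fire on events from a planner, manual trigger, or joystick.
from dataclasses import dataclass, field


@dataclass
class SkillNode:
    motion: str                                      # clip tracked while in this state
    transitions: dict = field(default_factory=dict)  # event name -> next skill


graph = {
    "idle": SkillNode("g1_debug",   {"start": "walk"}),
    "walk": SkillNode("g1_running", {"jump": "jump", "stop": "idle"}),
    "jump": SkillNode("g1_jump",    {"land": "walk"}),
}

state = "idle"
for event in ["start", "jump", "land", "stop"]:  # e.g. a stream of joystick inputs
    state = graph[state].transitions.get(event, state)
    print(f"event={event!r} -> tracking {graph[state].motion}")
```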
🎓 Want to understand TrackerLab quickly? 👉 Check out our full Tutorial, available in English or Chinese (中文版).
Assets and checkpoints can be downloaded from the Asset Repo, where we collect the assets and verify that they work both in simulation and on real hardware.
TrackerLab extends IsaacLab. Make sure IsaacLab and its dependencies are installed properly. Follow the official IsaacLab setup guide if needed.
```bash
# Clone TrackerLab
git clone https://github.com/interval-package/trackerlab.git
cd trackerlab

# Activate the IsaacLab conda environment
conda activate <env_isaaclab>

# Install TrackerLab and poselib
pip install -e .
pip install -e ./poselib
```
💡 No extra packages or repos required — it's fully self-contained!
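As a quick sanity check (assuming the editable installs expose the import names `trackerLab` and `poselib`), verify that both packages import from the activated environment:

```python
# Quick sanity check: both editable installs should be importable
# from the activated IsaacLab conda environment.
import trackerLab
import poselib

print("trackerLab:", trackerLab.__file__)
print("poselib:", poselib.__file__)
```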
- Download motion datasets: AMASS or CMU FBX.
- Apply the retargeting process (see the tutorial).
- Organize the data under `./data/` as shown in the data README (a loading sketch follows this list).
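Once clips are retargeted, they can be inspected with the bundled poselib. The sketch below is a minimal example: the file path is hypothetical (check the data README for the actual layout), and it assumes clips are stored in poselib's `SkeletonMotion` format.

```python
# Minimal sketch: load and inspect a retargeted clip with the bundled poselib.
# The path below is illustrative; see the data README for the real layout.
from poselib.skeleton.skeleton3d import SkeletonMotion

motion = SkeletonMotion.from_file("./data/retargeted/h1_walk.npy")
print("fps:", motion.fps)
print("global translations:", motion.global_translation.shape)  # (frames, joints, 3)
```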
- ✨ Fully modular and extensible
- 🤖 Designed for real-world humanoid control (e.g., Unitree H1)
- 📚 Clean codebase and manager-based environment design
- 🛠️ Easy integration of new motion sets and control modes
- 📁 **Project Structure**: Understand TrackerLab's layout and modular system.
- 🔄 **Data Flow**: Learn how data flows through the tracking, retargeting, and control pipeline.
- 🔄 **Problems**: Common problems you may encounter are recorded here.
New training and testing tasks are registered under `trackerLab/tasks/`. Custom Gym environments are registered recursively, including `H1TrackAll`, and can be used directly with IsaacLab's training scripts. Just add the following line to your training script:

```python
import trackerLab.tasks
```
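For example, here is a hedged sketch of instantiating a registered task outside the provided scripts. The module paths `isaaclab.app` and `isaaclab_tasks.utils` follow IsaacLab 2.x (adjust for your version), and the exact registered task IDs may differ from the name shown:

```python
# Sketch: instantiate a TrackerLab task via gymnasium (IsaacLab 2.x module paths).
from isaaclab.app import AppLauncher

app_launcher = AppLauncher(headless=True)  # Isaac Sim must start before env imports
simulation_app = app_launcher.app

import gymnasium as gym
import trackerLab.tasks  # noqa: F401  # side effect: registers TrackerLab tasks

from isaaclab_tasks.utils import parse_env_cfg

env_cfg = parse_env_cfg("H1TrackAll", num_envs=32)
env = gym.make("H1TrackAll", cfg=env_cfg)
obs, _ = env.reset()
```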
We also provide a copy of the training scripts from the original repo, which you can run directly:

```bash
python scripts/rsl_rl/base/train.py --task H1TrackingWalk --headless
# H1 tasks do not require generating a USD, since we use IsaacLab's bundled USD;
# note, however, that this hurts performance.
```
To play a trained policy, run one of the following from your workspace directory:

```bash
# With the GUI
python scripts/rsl_rl/base/play.py --task <Your task> --num_envs 32

# Headless, saving a video
python scripts/rsl_rl/base/play.py --task <Your task> --num_envs 32 --headless --video --video_length 500
```
If you find TrackerLab helpful for your work or research, please consider citing:
```bibtex
@software{zheng2025trackerlab,
  author = {Ziang Zheng},
  title  = {TrackerLab: One step to unify IsaacLab with multi-mode whole-body control},
  url    = {https://github.com/interval-package/trackerLab},
  year   = {2025}
}
```
Zaterval 📧 ziang_zheng@foxmail.com
Looking for collaborators and contributors — feel free to reach out or open an issue!
You can also join the WeChat group to get in touch!
This project is licensed under the MIT License. See LICENSE for details.