This code works with the basic DNO conda environment. All additions to the environment are handled by the SETUP.sh script, which clones two additional repositories crucial to the forward kinematics calculation and results visualization and installs their dependencies automatically. The rest of the setup does not vary from that of DNO, so the following commands are largely unchanged. Only the first Text-to-Motion dependency is crucial for running DNO.
Setup conda env:
conda env create -f environment_gmd.yml
conda activate gmd
python -m spacy download en_core_web_sm
pip install git+https://github.com/openai/CLIP.git
Download submodules:
bash SETUP.sh
Download dependencies:
bash prepare/download_smpl_files.sh
bash prepare/download_glove.sh
bash prepare/download_t2m_evaluators.sh
There are two paths to get the data:
(a) Generation only, with the pretrained text-to-motion model, without training or evaluating.
(b) Get the full data to train and evaluate the model.
HumanML3D - Clone HumanML3D, then copy the data dir to our repository:
cd ..
git clone https://github.com/EricGuo5513/HumanML3D.git
unzip ./HumanML3D/HumanML3D/texts.zip -d ./HumanML3D/HumanML3D/
cp -r HumanML3D/HumanML3D Diffusion-Noise-Optimization/dataset/HumanML3D
cd Diffusion-Noise-Optimization
HumanML3D - Follow the instructions in HumanML3D, then copy the resulting dataset to our repository:
Then copy the data to our repository
cp -r ../HumanML3D/HumanML3D ./dataset/HumanML3D
Download our version of MDM, then unzip and place it in ./save/.
The model is trained on the HumanML3D dataset.
- To enable or disable the different differentiable dynamics loss terms (for example, enabling/disabling angular dynamics, or reverting to baseline DNO), set the corresponding boolean flags in sample/condition.py:
com_term = True
angular_term = False
- To run a demo, simply run one of the provided motion generation scripts, for example motion_gen_jumping.sh. To get the correct pose targets, copy the values from the script into sample/dno_helper.py:
def task_pose_editing(task_info, args, target, target_mask):
    target_edit_list = [
        # (joint_index, keyframe, edit_dim, target (x, y, z))
        (0, 20, [0], [-0.2192]),
        (0, 20, [2], [0.8807]),
        (0, 40, [0], [-1.1151]),
        (0, 40, [2], [1.5184]),
    ]
This directory implements a complete physics validation pipeline that transforms SMPL motion sequences into biomechanically accurate skeletal motion with force analysis. The pipeline consists of four main stages:
- SMPL → Anatomical Joint Mapping (Sparse Regression)
- Inverse Kinematics Fitting ("SKELify")
- Biomechanical Modeling (Center of Mass & Inertia)
- Contact Forces & Dynamics Validation
inverse_kinematics.py - Main entry point that orchestrates the complete pipeline
- Function: run_ik(input_joints, debug=False) - Processes SMPL joints through the full biomechanical analysis
- Returns: physics loss metrics, contact forces, and SKEL fitting results
train.py - Trains the sparse linear regressor that maps SMPL joints to anatomical joint positions
- Uses stratified pose sampling and Lasso regression with anatomical locality constraints (a simplified sketch follows below)
- Achieves ~1cm RMSE with >89% sparsity
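The actual training code lives in train.py; the following is only a minimal, self-contained sketch of the core idea on synthetic data: a Lasso-regularized linear map from flattened SMPL joints to anatomical joints, with a locality mask that only allows weights between nearby joints. The joint counts, the distance threshold, and all hyperparameters are illustrative assumptions, not the repository's actual values.

```python
# Minimal sketch (NOT the repository's train.py): Lasso-style sparse linear
# regression from SMPL joints to anatomical joints with a locality mask.
# Joint counts, the distance threshold, and all hyperparameters are assumptions.
import torch

N_SMPL, N_ANAT = 22, 24                 # assumed joint counts
frames = 4096                           # synthetic stand-in for sampled poses

smpl = torch.randn(frames, N_SMPL * 3)  # flattened SMPL joint positions
anat = torch.randn(frames, N_ANAT * 3)  # flattened anatomical joint targets

# Locality mask: an anatomical joint may only depend on nearby SMPL joints.
# A random reference pose stands in for the anatomical adjacency used in practice.
ref_smpl, ref_anat = torch.randn(N_SMPL, 3), torch.randn(N_ANAT, 3)
dist = torch.cdist(ref_anat, ref_smpl)                         # [N_ANAT, N_SMPL]
mask = (dist < dist.median()).float()                          # 1 = weight allowed
mask = mask.repeat_interleave(3, 0).repeat_interleave(3, 1)    # expand to xyz coords

W = torch.zeros(N_ANAT * 3, N_SMPL * 3, requires_grad=True)
b = torch.zeros(N_ANAT * 3, requires_grad=True)
opt = torch.optim.Adam([W, b], lr=5e-3)
alpha = 5e-4                                                   # L1 (Lasso) strength

for step in range(200):
    pred = smpl @ (W * mask).T + b
    loss = ((pred - anat) ** 2).mean() + alpha * (W * mask).abs().sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

rmse = ((smpl @ (W * mask).T + b - anat) ** 2).mean().sqrt()
sparsity = ((W * mask)[mask.bool()].abs() < 1e-4).float().mean()
print(f"RMSE: {rmse:.3f}, sparsity among allowed weights: {sparsity:.1%}")
```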
test.py - Quick test script to validate pipeline functionality and compute physics losses
visualize.py - Creates detailed biomechanical visualizations including:
- Ground reaction forces (GRF) at the center of pressure
- Center of mass trajectory and required forces
- Contact spheres and force vectors
- Configure via the npy_file and output_dir variables
anatomical_joint_regressor.py - Sparse linear regression from SMPL to anatomical joints
anatomical_joint_ik_adam.py - Inverse kinematics optimizer for SKEL model fitting
center_of_mass_calculator.py - Biomechanical properties calculation (CoM, inertia, angular momentum)
contact_models_torch.py - Ground contact detection and force estimation using a Kelvin-Voigt model (see the sketch after this list)
losses.py - Physics validation losses implementing Newton's 2nd law and Euler's equations
joints_utils.py - Joint manipulation and coordinate transformation utilities
visualization.py - Plotting and animation utilities for biomechanical analysis
animation.py - Advanced animation rendering for motion sequences
body_properties_output.py - Anthropometric data and segment properties
calc_dist.py - Distance and geometric calculations
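For reference, here is a minimal sketch of a Kelvin-Voigt (spring-damper) ground contact force for a set of contact spheres; it is not the implementation in contact_models_torch.py. The stiffness, damping, sphere radius, y-up ground plane, and the friction-free, vertical-only formulation are illustrative assumptions.

```python
# Minimal sketch of a Kelvin-Voigt (spring-damper) ground contact model,
# NOT the code in contact_models_torch.py. Stiffness, damping, and sphere
# radius are illustrative assumptions; only the vertical (normal) component
# is modeled, with the ground plane at y = 0.
import torch

def kelvin_voigt_grf(sphere_pos, sphere_vel, radius=0.032, k=3e4, c=300.0):
    """Vertical ground reaction force per contact sphere.

    sphere_pos, sphere_vel: [frames, n_spheres, 3] tensors.
    Returns: [frames, n_spheres] non-negative normal forces (N).
    """
    penetration = (radius - sphere_pos[..., 1]).clamp(min=0.0)   # depth into ground
    in_contact = penetration > 0.0
    # Spring term (proportional to depth) + damper term (opposes downward velocity)
    f_normal = k * penetration - c * sphere_vel[..., 1] * in_contact
    return f_normal.clamp(min=0.0)                               # no pulling forces

# Toy usage: one airborne and one penetrating sphere over 5 frames
pos = torch.tensor([[[0.0, 0.05, 0.0], [0.2, 0.01, 0.0]]]).repeat(5, 1, 1)
vel = torch.zeros_like(pos)
vel[..., 1] = -0.1
print(kelvin_voigt_grf(pos, vel))
```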
smpl_to_osim_regressor_male.pt - Pre-trained sparse regression model
*_data_male.npy - Training data for SMPL and anatomical joint correspondences
test_*.npy - Validation datasets and predictions
- Stores intermediate results, fitted models, and analysis outputs
- Contains visualizations, fitted poses, and physics validation metrics
from inverse_dynamics.inverse_kinematics import run_ik
import torch
# Load SMPL motion data (shape: [frames, joints, 3])
smpl_joints = torch.load("path/to/smpl_motion.pt")
# Run complete physics pipeline
losses, contact_forces, skel_results = run_ik(
input_joints=smpl_joints,
debug=True # Enable detailed logging
)
print(f"Linear dynamics loss: {losses['translational_loss']}")
print(f"Angular dynamics loss: {losses['rotational_loss']}")from inverse_dynamics.utils.anatomical_joint_regressor import SparseSMPLtoAnatomicalRegressor
regressor = SparseSMPLtoAnatomicalRegressor(
output_dir="./inverse_dynamics/regressor",
gender="male"
)
regressor.run_pipeline(
num_samples=20000,
alpha=0.0005, # Lasso regularization strength
learning_rate=0.005,
num_epochs=2000
)
Edit variables in inverse_dynamics/visualize.py:
npy_file = "save/your_motion_results/results.npy"
output_dir = "./inverse_dynamics/output"Then generate biomechanical visualizations:
python inverse_dynamics/visualize.pyThe pipeline computes two key physics consistency measures:
Validates Newton's 2nd law: F_GRF = m(a_COM + g)
- Compares ground reaction forces with required forces at center of mass
- Lower values indicate better force balance
Validates Euler's equation: M_GRF + M_gravity = dL/dt
- Compares ground reaction moments with angular momentum rate of change
- Accounts for rotational consistency
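The actual loss implementations are in losses.py; the sketch below only illustrates the two residuals on synthetic data, assuming the per-frame total GRF, CoM trajectory, angular momentum, and GRF moment are already available. The frame rate, body mass, y-up convention, and mean-squared aggregation are illustrative assumptions.

```python
# Minimal sketch of the two physics consistency measures (not losses.py):
# per-frame residuals of Newton's 2nd law and Euler's equation, evaluated on
# synthetic data. Frame rate, body mass, the up axis, and the mean-squared
# aggregation are illustrative assumptions.
import torch

fps, mass, g_mag = 20.0, 70.0, 9.81
up = torch.tensor([0.0, 1.0, 0.0])

frames = 60
com = torch.cumsum(torch.randn(frames, 3) * 0.01, dim=0)     # CoM positions [m]
grf = torch.randn(frames, 3) * 50 + mass * g_mag * up        # total GRF [N]
ang_momentum = torch.randn(frames, 3) * 0.5                  # L about the origin
grf_moment = torch.randn(frames, 3) * 5.0                    # GRF moment about the origin

# Finite differences: central for CoM acceleration, forward for dL/dt.
com_acc = (com[2:] - 2 * com[1:-1] + com[:-2]) * fps**2
dL_dt = (ang_momentum[1:] - ang_momentum[:-1]) * fps

# Translational residual of F_GRF = m (a_COM + g), with g the gravity magnitude along up
translational_loss = ((grf[1:-1] - mass * (com_acc + g_mag * up)) ** 2).mean()

# Rotational residual of M_GRF + M_gravity = dL/dt, moments taken about the origin
m_gravity = torch.cross(com, (-mass * g_mag * up).expand_as(com), dim=-1)
rotational_loss = ((grf_moment[:-1] + m_gravity[:-1] - dL_dt) ** 2).mean()

print(f"translational loss: {translational_loss:.2f}")
print(f"rotational loss:    {rotational_loss:.2f}")
```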
For high-quality mesh rendering with surface geometry, use aitviewer-skel (requires python >= 3.9).
python external/aitviewer-skel/examples/load_SKEL_with_real_contact.py \
-s '/path/to/jumping_ik_results.pkl' \
-c '/path/to/contact_output.pt' \
--force_scale 0.002 --sphere_radius 0.032
python external/aitviewer-skel/examples/load_SKEL_with_dynamics_analysis.py \
-s '/path/to/jumping_ik_results.pkl' \
-c '/path/to/contact_output.pt' \
-m '/path/to/com_analysis_results.json' \
--force_scale 0.0006
Implements Newton's second law: F_GRF = m(a_COM + g)
- Blue vectors: Ground reaction forces at center of pressure
- Green vectors: Required forces at center of mass
- Red sphere: Center of mass with trajectory trail
- Orange square: Center of pressure
Implements Euler's equation: M_GRF + M_gravity = dL/dt
- Blue vectors: Moment from ground reaction forces
- Magenta vectors: Moment from gravity
- Green vectors: Rate of change of angular momentum
This biomechanical pipeline enables detailed physics validation for motion generation algorithms and provides tools for understanding human movement dynamics in applications ranging from animation to clinical biomechanics.



