This is an open collection of state-of-the-art (SOTA) and novel Text-to-X methods (where X can be anything), including papers, code, and datasets.
Updated Sep 19, 2024
(CVPR 2023) PyTorch implementation of "T2M-GPT: Generating Human Motion from Textual Descriptions with Discrete Representations"
Official implementation of CVPR24 highlight paper "Move as You Say, Interact as You Can: Language-guided Human Motion Generation with Scene Affordance"
Official implementation of "MoMask: Generative Masked Modeling of 3D Human Motions (CVPR2024)"
KMM: Key Frame Mask Mamba for Extended Motion Generation
Official Page of paper "TSTMotion: Training-free Scene-aware Text-to-motion Generation"
[CVPR 2022] OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction
Official implementation of "TM2T: Stochastic and Tokenized Modeling for the Reciprocal Generation of 3D Human Motions and Texts (ECCV2022)"
Official implementation for "Generating Diverse and Natural 3D Human Motions from Texts (CVPR2022)."
HumanML3D: A large and diverse 3d human motion-language dataset.
Official implementations for "Action2Motion: Conditioned Generation of 3D Human Motions (ACM MultiMedia 2020)"
🔥 [ECCV 2024] Motion Mamba: Efficient and Long Sequence Motion Generation
[CVPRW 2024] Official Implementation of "in2IN: Leveraging individual Information to Generate Human INteractions".
[BMVC 2024] Motion Avatar: Generate Human and Animal Avatars with Arbitrary Motion
SignAvatars: A Large-scale 3D Sign Language Holistic Motion Dataset and Benchmark
NVIDIA-accelerated packages for arm motion planning and control
[IJCV 2024] InterGen: Diffusion-based Multi-human Motion Generation under Complex Interactions
MotionDiffuse: Text-Driven Human Motion Generation with Diffusion Model
InfiniMotion: Mamba Boosts Memory in Transformer for Arbitrary Long Motion Generation
PyTorch implementation of MoLA