
NeMo [CVPR2023 Highlight]


This repo contains the official PyTorch implementation of our paper:

NeMo: 3D Neural Motion Fields from Multiple Video Instances of the Same Action
by Kuan-Chieh (Jackson) Wang, Zhenzhen Weng, Maria Xenochristou, Joao Pedro Araujo, Jeffrey Gu, C. Karen Liu, Serena Yeung

(Project Page 🌐 | Paper 📄 | Data 📀)

(Figures: NeMo system overview; NeMo demo.)

Installation

Environment

  1. Clone this repository.
git clone git@github.com:wangkua1/nemo-cvpr2023.git
  2. Create a conda environment using the provided environment file.
conda env create -f environment.yml

Then, activate the conda environment using

conda activate nemo
  3. Pip-install the remaining packages using the provided requirements file.
pip install -r requirements.txt
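
As an optional sanity check, you can confirm that PyTorch imports in the activated environment (this assumes the environment file installs PyTorch, which it should since this is a PyTorch implementation):

python -c "import torch; print(torch.__version__)"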

Other software downloads

  1. Create a directory software.

  2. Download the required components:

  • "V02_05" -- requied by human_body_prior. Follow the original instructions at the VPoser github page.
  • "spin_data" -- Follow the original instructions at the SPIN github page.
  • "smpl" -- Follow the original instructions at their website.

Alternatively, download them at this link (~0.5GB). Note: we provide these only for the purpose of reproducing our work; please respect the original instructions, licenses, and copyrights.
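
After these steps, the software directory should look roughly as follows (illustrative; the exact contents of each folder follow the respective upstream projects):

/nemo-cvpr2023
-- /software
  | -- /V02_05
  | -- /spin_data
  | -- /smpl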

Dataset: NeMo-MoCap

  1. Download the dataset from this Google Drive folder (~1GB). You should organize your files into the following structure:
/nemo-cvpr2023
-- /data
  | -- /videos
  |   | -- <ACTION>.<INDEX>.mp4
  |   |  ......
  | -- /exps
  |   | -- /mymocap_<ACTION>
  |   |    | -- /<ACTION>.<INDEX>
  |   |    | -- /<ACTION>.<INDEX>.mp4_gt
  |   |    | -- /<ACTION>.<INDEX>.mp4_openpose
  |   |  ......
  | -- /mocap
  |   | -- <ACTION>.<INDEX>.pkl
  |   |  ......
  | -- opt_cam_IMG_6287.pt
  | -- opt_cam_IMG_6289.pt

  2. Convert the mp4 videos into frames (takes <3 min).
python -m scripts.video_to_frames
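
For reference, the conversion just dumps each mp4 under data/videos into a folder of per-frame images. A minimal sketch of the idea, assuming OpenCV (the actual scripts.video_to_frames may differ in frame naming and output paths):

import glob
import os

import cv2  # assumption: OpenCV backend; the real script may use something else

for video_path in sorted(glob.glob('data/videos/*.mp4')):
    out_dir = os.path.splitext(video_path)[0]  # hypothetical output layout
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imwrite(os.path.join(out_dir, '%06d.png' % idx), frame)
        idx += 1
    cap.release()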

Running NeMo

An example inference script for running NeMo on the "Baseball Pitch" motion from the NeMo-MoCap dataset is provided in run_scripts_examples.
You can run it locally using the following command:

bash run_scripts_examples/nemomocap-example.sh 0

or launch it as a SLURM job with sbatch using

bash run_scripts_examples/nemomocap-example.sh 1

Running NeMo on your custom video dataset

If you wish to run NeMo on your own video dataset, refer to Custom Video README.

Acknowledgement

NeMo is built on many other great works, including VPoser, SPIN, SMPL, HMR, VIBE, DAPA, GLAMR.

Citation

If you find this work useful, please consider citing:

@inproceedings{wang2022nemo,
  title={NeMo: 3D Neural Motion Fields from Multiple Video Instances of the Same Action},
  author={Wang, Kuan-Chieh and Weng, Zhenzhen and Xenochristou, Maria and Araujo, Joao Pedro and Gu, Jeffrey and Liu, C Karen and Yeung, Serena},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2023},
  arxiv={2212.13660}
}
