
yz-cnsdqz/DOMA-release


Degrees of Freedom Matter: Inferring Dynamics from Point Trajectories

Project page · Paper

License

The code is distributed under the MIT License.

Install

pip install -r requirements.txt

Essential python libraries:

PyYAML==6.0.1
torch==2.0.1
torchgeometry==0.1.2
trimesh==3.23.5
numpy==1.23.1

Data

DeformingThings4D

We use this dataset for novel point motion prediction. Before downloading it, please check its official site and agree to its terms of use. See our extended ResField repo for how we processed this dataset.

Synthetic

We created this dataset of 4 sequences for novel point motion prediction. One can download our version here, or generate new synthetic data with utils/gen_syntheticdataset_3d.py.

Resynth

We use this dataset for temporal mesh alignment with guidance. See its official website for more information and download instructions.

In our experiments, we choose 16 sequences from 4 subjects among the packed sequences in the test split. For each sequence, we first down-sample by keeping every 2nd frame, and then take the first 30 frames as the new sequence to model. The selected sequences are listed in our paper's supplementary material.
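The preprocessing above can be sketched in a few lines; the function name and dummy frame data below are our own placeholders, not part of the released code.

```python
# Sketch of the Resynth preprocessing described above: stride-2 down-sampling
# followed by truncation to the first 30 frames.
def preprocess_sequence(frames, stride=2, num_frames=30):
    """Keep every `stride`-th frame, then the first `num_frames` of those."""
    return frames[::stride][:num_frames]

# With a dummy 120-frame sequence (integers stand in for scan frames):
seq = preprocess_sequence(list(range(120)))
# len(seq) == 30; frames 0, 2, 4, ..., 58
```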

Particle Simulation

We conducted an additional experiment in the supplementary material, and used this GitHub repo to produce the data. Running the simulation and extracting the particles requires some basic knowledge of Unity3D. Alternatively, one can download our extracted and processed data here.

Usage

The training, evaluation, and rendering scripts are the train_*.py files.

Novel Point Motion Prediction

Besides the three baselines implemented in our modified ResField repo, one can set up the dataset path and run, e.g.,

python train_deformingthings4d.py --motion_model_name=affinefield4d 

All options of motion_model_name are listed in the MotionField class in models/dpfs.py.

The results are saved to the output folder. The point renderer settings can be modified in render_points() in utils/vis.py. Details of the configurations are in train_deformingthings4d.py.

Similarly, one can use the following to learn motion fields for the 4 synthetic sequences.

python train_synthetic.py --motion_model_name=affinefield4d --homo_loss_weight=0.1

Run the following to learn the motion field in the simulated fluid field.

python train_particlesim.py --sequence=/doma_datasets/particlesim/particles_0000.npy --motion_model_name=transfield4d --start=100 --end=110
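To inspect a window of the simulated particle data before training, a minimal sketch follows. We assume the .npy file holds per-frame particle positions with shape (num_frames, num_particles, 3); the actual layout in the released data may differ, and the function name is ours.

```python
import numpy as np

def load_particle_window(path, start, end):
    """Load simulated particles and return frames [start, end).

    Assumes an array of shape (num_frames, num_particles, 3).
    """
    particles = np.load(path)
    return particles[start:end]

# e.g. frames 100..109, matching --start=100 --end=110 above:
# window = load_particle_window("/doma_datasets/particlesim/particles_0000.npy", 100, 110)
```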

Temporal Mesh Alignment with Guidance

One can start with train_resynth.py, e.g.

python train_resynth.py --subject=rp_aaron_posed_002 --seq=96_jerseyshort_hips --motion_model_name=affinefield4d

Citation

@inproceedings{DOMA,
    title   = {Degrees of Freedom Matter: Inferring Dynamics from Point Trajectories},
    author  = {Yan Zhang and Sergey Prokudin and Marko Mihajlovic and Qianli Ma and Siyu Tang},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    year    = {2024}
}

Related Projects

@inproceedings{mihajlovic2024ResFields,
   title={{ResFields}: Residual Neural Fields for Spatiotemporal Signals},
   author={Mihajlovic, Marko and Prokudin, Sergey and Pollefeys, Marc and Tang, Siyu},
   booktitle={International Conference on Learning Representations (ICLR)},
   year={2024}
} 

@inproceedings{prokudin2023dynamic,
    title = {Dynamic Point Fields},
    author = {Prokudin, Sergey and Ma, Qianli and Raafat, Maxime and Valentin, Julien and Tang, Siyu},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    pages = {7964--7976},
    month = oct,
    year = {2023}
}

About

Official implementation of Degrees of Freedom Matter: Inferring Dynamics from Point Trajectories (CVPR 2024).
