Neural Conservation Laws: A Divergence-Free Perspective

This repository contains code for the NeurIPS 2022 paper, available at https://arxiv.org/abs/2210.01741.

Building divergence-free neural networks
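The paper's core construction obtains an exactly divergence-free vector field by taking the row-wise divergence of an antisymmetric matrix field. A minimal sketch of the idea in 2D, where it reduces to the classic stream-function trick: the scalar potential f below is a hypothetical stand-in for a neural network output, and the divergence is checked numerically with central differences.

```python
import math

# Hypothetical smooth scalar potential f(x, y); in the paper this role is
# played by a neural network. Any differentiable f works here.
def f(x, y):
    return math.sin(x) * math.cos(y) + x * y ** 2

H = 1e-4  # central-difference step

# With the antisymmetric matrix A = [[0, f], [-f, 0]], the field
# v_i = sum_j d_j A_ij becomes v = (df/dy, -df/dx), which is
# divergence-free by construction (mixed partials commute).
def v(x, y):
    dfdy = (f(x, y + H) - f(x, y - H)) / (2 * H)
    dfdx = (f(x + H, y) - f(x - H, y)) / (2 * H)
    return dfdy, -dfdx

def divergence(x, y):
    dv1_dx = (v(x + H, y)[0] - v(x - H, y)[0]) / (2 * H)
    dv2_dy = (v(x, y + H)[1] - v(x, y - H)[1]) / (2 * H)
    return dv1_dx + dv2_dy

print(abs(divergence(0.3, -1.2)))  # numerically ~0
```

In higher dimensions the same identity holds for any antisymmetric A(x); the repository's jax and pytorch subdirectories implement the construction with automatic differentiation rather than finite differences.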

Experiments

Experiments on training Neural Conservation Laws (and baselines) for fluid simulation are available in the jax subdirectory.

Experiments on training Neural Conservation Laws (and baselines) for dynamical optimal transport are available in the pytorch subdirectory.

Citations

If you find this repository helpful in your research, please consider citing our paper:

@inproceedings{richter-powell2022neural,
    title={Neural Conservation Laws: A Divergence-Free Perspective},
    author={Jack Richter-Powell and Yaron Lipman and Ricky T. Q. Chen},
    booktitle={Advances in Neural Information Processing Systems},
    year={2022},
}

License

This repository is licensed under the CC BY-NC 4.0 License.
