Learning Smooth Neural Functions via Lipschitz Regularization (PyTorch)

This is a PyTorch re-implementation of the paper Learning Smooth Neural Functions via Lipschitz Regularization by Hsueh-Ti Derek Liu, Francis Williams, Alec Jacobson, Sanja Fidler, and Or Litany, SIGGRAPH (North America), 2022 [Preprint].

Dependencies

This re-implementation depends on PyTorch and common Python packages (e.g., numpy, tqdm, matplotlib). Some functions in the scripts, such as generating analytical signed distance functions and the Lipschitz linear layer, depend on code in the utils and models folders, respectively.
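The core building block of the model is the Lipschitz linear layer, which rescales its weight rows so the matrix infinity-norm never exceeds a learnable per-layer bound softplus(c). The sketch below illustrates this normalization under my reading of the paper; the class and attribute names are illustrative and may not match the repo's models folder.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LipschitzLinear(nn.Module):
    """Linear layer with a learnable Lipschitz bound softplus(c).

    Each weight row is rescaled so its absolute sum stays below
    softplus(c), bounding the matrix inf-norm (illustrative sketch,
    not necessarily the repo's exact implementation).
    """

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features) / in_features ** 0.5)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # initialize c so softplus(c) equals the initial inf-norm
        # (inverse softplus: c = log(exp(norm) - 1))
        init_norm = self.weight.abs().sum(dim=1).max()
        self.c = nn.Parameter(torch.log(torch.expm1(init_norm)))

    def forward(self, x):
        # shrink only the rows whose absolute sum exceeds softplus(c)
        abs_row_sum = self.weight.abs().sum(dim=1)
        scale = torch.clamp(F.softplus(self.c) / abs_row_sum, max=1.0)
        return F.linear(x, self.weight * scale[:, None], self.bias)
```

Because the bound is enforced by rescaling rather than by a hard constraint, the layer stays differentiable and c can be trained jointly with the weights.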

Repository Structure

  • 2D_interpolation contains the scripts to train a Lipschitz MLP to interpolate the 2D signed distance functions of a cross and a star. main_mlp.py and main_lipmlp.py are the main training scripts for a standard MLP and a Lipschitz MLP, respectively. To train the model from scratch, simply run
python main_lipmlp.py
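The training objective behind this script pairs the SDF reconstruction loss with a Lipschitz regularizer: the product of the per-layer bounds softplus(c), which upper-bounds the network's overall Lipschitz constant. The minimal stand-in below sketches that loss composition; the model, the regularization weight alpha, and the dummy data are my assumptions, not the repo's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyLipMLP(nn.Module):
    """Toy stand-in: each layer carries a learnable bound parameter c.

    The weight normalization by softplus(c) is omitted here for brevity;
    only the regularization term is shown.
    """

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(3, 32)   # input: (x, y, latent t)
        self.fc2 = nn.Linear(32, 1)   # output: signed distance
        self.c1 = nn.Parameter(torch.tensor(1.0))
        self.c2 = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        return self.fc2(torch.tanh(self.fc1(x)))

    def lipschitz_loss(self):
        # product of per-layer bounds upper-bounds the network's
        # Lipschitz constant
        return F.softplus(self.c1) * F.softplus(self.c2)

model = TinyLipMLP()
x = torch.randn(128, 3)        # sampled query points and latent codes
target = torch.randn(128, 1)   # dummy ground-truth SDF values
alpha = 1e-6                   # illustrative regularization weight
loss = F.mse_loss(model(x), target) + alpha * model.lipschitz_loss()
loss.backward()
```

Minimizing the product of bounds pushes the network toward the smallest Lipschitz constant that still fits the data, which is what yields the smooth latent interpolations.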

After training, you should see the interpolation results in the sub-folders mlp and lipmlp, and the model parameters in mlp_params.pt and lipmlp_params.pt.
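At inference time, the interpolation results are produced by conditioning the trained network on a latent code t that slides between the two shapes (t = 0 for one shape, t = 1 for the other). The sketch below shows this evaluation pattern with a placeholder network; the input layout (x, y, t) matches the 2D setting but the model itself is a dummy stand-in.

```python
import torch
import torch.nn as nn

# placeholder for the trained Lipschitz MLP: input (x, y, t), output SDF
model = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 1))

# evaluate the SDF on a regular grid at several latent codes
n = 64
xs = torch.linspace(-1.0, 1.0, n)
grid = torch.stack(torch.meshgrid(xs, xs, indexing="ij"), dim=-1).reshape(-1, 2)
frames = []
for t in (0.0, 0.5, 1.0):
    latent = torch.full((grid.shape[0], 1), t)
    with torch.no_grad():
        sdf = model(torch.cat([grid, latent], dim=1))  # (n*n, 1)
    frames.append(sdf.reshape(n, n))  # one SDF image per latent code
```

With a Lipschitz-regularized model, intermediate codes such as t = 0.5 yield smooth in-between shapes rather than the artifacts a standard MLP can produce.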

Results

Interpolation comparison between the standard MLP and the Lipschitz MLP.

Contact

This is my re-implementation of the paper. If you have any questions, please contact Whitney Chiu at wchiu@gatech.edu.
