Add 'Invertible Monotone Operators for Normalizing Flows' (#59)
byeongkeunahn committed Apr 4, 2023
1 parent bb7e1f3 commit b654230
Showing 3 changed files with 12 additions and 2 deletions.
4 changes: 2 additions & 2 deletions data/make_readme.py
@@ -44,7 +44,7 @@ class Section(TypedDict):

 def load_items(key: str) -> list[Item]:
     """Load list[Item] from YAML file."""
-    with open(f"{ROOT}/data/{key}.yml") as file:
+    with open(f"{ROOT}/data/{key}.yml", encoding="utf8") as file:
         return yaml.safe_load(file.read())


@@ -162,7 +162,7 @@ def validate_item(itm: Item, section_title: str) -> None:
section["markdown"] += md_str + "\n\n"


with open(f"{ROOT}/readme.md", "r+") as file:
with open(f"{ROOT}/readme.md", "r+", encoding="utf8") as file:
readme = file.read()

for section in sections.values():
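Both hunks above make the same fix: without an explicit encoding, Python's open() falls back to the platform's locale encoding, so a UTF-8 file containing non-ASCII characters such as the "×" in "k×k" either decodes to mojibake or raises UnicodeDecodeError, depending on the locale codec. A minimal sketch of the fix (the file path is illustrative):

```python
# Sketch: reading a UTF-8 YAML file that contains "×" (U+00D7).
# Relying on the locale default (e.g. cp1252 on Windows) is not portable;
# an explicit encoding makes the read behave the same on every platform.
with open("data/publications.yml", encoding="utf8") as file:
    text = file.read()
```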
7 changes: 7 additions & 0 deletions data/publications.yml
@@ -384,3 +384,10 @@
   authors: Aditya Kallapa, Sandeep Nagar, Girish Varma
   description: propose a k×k convolutional layer and Deep Normalizing Flow architecture which i) has a fast parallel inversion algorithm with running time O(nk^2) (n is the height and width of the input image and k is the kernel size), ii) masks the minimal number of learnable parameters in a layer, and iii) achieves forward-pass and sampling times comparable to other k×k convolution-based models on real-world benchmarks. The authors provide a GPU implementation of the proposed parallel sampling algorithm using their invertible convolutions.
   repo: https://github.com/aditya-v-kallappa/FInCFlow
+
+- title: 'Invertible Monotone Operators for Normalizing Flows'
+  url: https://arxiv.org/abs/2210.08176
+  date: 2022-10-15
+  authors: Byeongkeun Ahn, Chiyoon Kim, Youngjoon Hong, Hyunwoo J. Kim
+  description: Proposes a monotone-operator formulation, backed by an in-depth theoretical analysis, to overcome the restrictive Lipschitz-constant constraints of previous ResNet-based normalizing flows. It also introduces the Concatenated Pila (CPila) activation function to improve gradient flow. The resulting model, Monotone Flows, shows excellent performance on multiple density-estimation benchmarks (MNIST, CIFAR-10, ImageNet32, ImageNet64).
+  repo: https://github.com/mlvlab/MonotoneFlows
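For reference, the fields used by these entries suggest an Item shape roughly like the sketch below. This is inferred from the YAML alone; the actual TypedDict in make_readme.py may differ:

```python
from typing import TypedDict

class Item(TypedDict, total=False):
    title: str        # paper title
    url: str          # arXiv or publisher link
    date: str         # publication date, YYYY-MM-DD
    authors: str      # comma-separated author list
    description: str  # short summary paragraph
    repo: str         # optional link to a code release
```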
3 changes: 3 additions & 0 deletions readme.md
@@ -48,6 +48,9 @@ A list of awesome resources for understanding and applying normalizing flows (NF
 1. 2023-01-03 - [FInC Flow: Fast and Invertible k×k Convolutions for Normalizing Flows](https://arxiv.org/abs/2301.09266) by Kallapa, Nagar et al.<br>
    propose a k×k convolutional layer and Deep Normalizing Flow architecture which i) has a fast parallel inversion algorithm with running time O(nk^2) (n is the height and width of the input image and k is the kernel size), ii) masks the minimal number of learnable parameters in a layer, and iii) achieves forward-pass and sampling times comparable to other k×k convolution-based models on real-world benchmarks. The authors provide a GPU implementation of the proposed parallel sampling algorithm using their invertible convolutions. [[Code](https://github.com/aditya-v-kallappa/FInCFlow)]
 
+1. 2022-10-15 - [Invertible Monotone Operators for Normalizing Flows](https://arxiv.org/abs/2210.08176) by Ahn, Kim et al.<br>
+   Proposes a monotone-operator formulation, backed by an in-depth theoretical analysis, to overcome the restrictive Lipschitz-constant constraints of previous ResNet-based normalizing flows. It also introduces the Concatenated Pila (CPila) activation function to improve gradient flow. The resulting model, Monotone Flows, shows excellent performance on multiple density-estimation benchmarks (MNIST, CIFAR-10, ImageNet32, ImageNet64). [[Code](https://github.com/mlvlab/MonotoneFlows)]
+
 1. 2022-08-18 - [ManiFlow: Implicitly Representing Manifolds with Normalizing Flows](https://arxiv.org/abs/2208.08932) by Postels, Danelljan et al.<br>
    The invertibility constraint of NFs imposes limitations when data distributions reside on lower-dimensional manifolds embedded in a higher-dimensional space; this is often bypassed by adding noise to the data, which degrades the quality of generated samples. This work generates samples from the original data distribution given full knowledge of the perturbed distribution and the noise model. The authors establish that NFs trained on perturbed data implicitly represent the manifold in regions of maximum likelihood, then propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.

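Background for the Monotone Flows entry above: ResNet-based flows such as i-ResNet keep each residual block y = x + g(x) invertible by constraining g to be contractive (Lipschitz constant below 1) and invert the block by fixed-point iteration; the monotone-operator formulation targets exactly that restriction. A minimal sketch of the classic Lipschitz-constrained baseline, not the paper's method:

```python
import torch

def invert_residual_block(y: torch.Tensor, g, n_iters: int = 50) -> torch.Tensor:
    """Invert y = x + g(x) by Banach fixed-point iteration.

    Converges when g is contractive (Lipschitz constant < 1), the very
    constraint that the monotone-operator formulation is meant to relax.
    """
    x = y.clone()
    for _ in range(n_iters):
        x = y - g(x)  # at the fixed point, x + g(x) == y
    return x

# Usage sketch with a 0.5-Lipschitz nonlinearity as g.
g = lambda x: 0.5 * torch.tanh(x)
y = torch.randn(4)
x = invert_residual_block(y, g)
print(torch.allclose(x + g(x), y, atol=1e-5))  # True
```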
