Some tricks to improve flow networks
The `Modular` branch is currently maintained.
The following are implemented:
- Faster FFT of real-valued vectors by exploiting the conjugate symmetry of their Fourier transforms. The major contribution here was registering gradients for `tf.irfft3d` and writing an implementation for `rfft3d`.
- Gradient checkpointing for invertible networks, allowing constant-memory backprop. `gradient_checkpointing.py` contains the relevant layer and model classes, which can be generalized to other models.
- Variational dequantization according to this paper. `dequantization.py` contains the `FlowWithDequant` class, which can wrap around any flow to make it dequantized.
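As a minimal sketch of the symmetry trick behind the first bullet (this is an illustration in NumPy, not the repo's TensorFlow code, and `rfft_two_for_one` is a hypothetical name): because the FFT of a real signal is Hermitian, two real-valued FFTs can be recovered from a single complex FFT.

```python
import numpy as np

def rfft_two_for_one(a, b):
    """Compute the FFTs of two real vectors with one complex FFT.

    FFT(real signal) is Hermitian (X[-k] == conj(X[k])), so packing two
    real signals into one complex vector lets us separate their spectra
    afterwards from the Hermitian and anti-Hermitian parts.
    """
    z = a + 1j * b                # pack both real signals into one complex vector
    Z = np.fft.fft(z)
    Zc = np.roll(np.conj(Z[::-1]), 1)  # index k now holds conj(Z[-k mod N])
    A = 0.5 * (Z + Zc)            # Hermitian part -> spectrum of a
    B = -0.5j * (Z - Zc)          # anti-Hermitian part / i -> spectrum of b
    return A, B
```

The same symmetry is what makes `rfft`-style transforms roughly twice as cheap as a general complex FFT of the same length.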
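The constant-memory backprop of the second bullet works because an invertible layer can recompute its inputs from its outputs during the backward pass, so activations need not be stored. A minimal sketch with an additive coupling layer whose coupling network is a fixed linear map `W` (a stand-in for the real layer classes in `gradient_checkpointing.py`):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))   # stand-in coupling network m(x1) = W @ x1

def forward(x1, x2):
    # additive coupling: y1 = x1, y2 = x2 + m(x1)
    return x1, x2 + W @ x1

def inverse(y1, y2):
    # exact inverse: lets backprop recompute inputs instead of storing them
    return y1, y2 - W @ y1

def backward(y1, y2, dy1, dy2):
    # reconstruct the inputs from the outputs, then apply the chain rule;
    # activation memory is O(1) in depth when layers are chained this way
    x1, x2 = inverse(y1, y2)
    dW = np.outer(dy2, x1)        # grad w.r.t. the coupling parameters
    dx1 = dy1 + W.T @ dy2
    dx2 = dy2
    return dx1, dx2, dW
```

Chaining such layers, only the final output must be kept; each layer's inputs are reconstructed on the fly as gradients flow backwards.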
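For context on the third bullet: dequantization adds noise `u` in `[0, 1)` to discrete (e.g. 8-bit) data so a continuous flow can model it; the variational version learns the noise distribution `q(u | x)` with a conditional flow and optimizes the bound `log p(x) >= E_q[log p_model(x + u) - log q(u | x)]`. The sketch below shows only the simple uniform-noise baseline (the learned `q` in `FlowWithDequant` is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def dequantize(x, n_bits=8):
    """Uniform dequantization of integer-valued data to [0, 1).

    Variational dequantization replaces the uniform u with a sample from
    a learned conditional flow q(u | x), tightening the likelihood bound.
    """
    u = rng.random(x.shape)           # u ~ Uniform[0, 1)
    return (x + u) / 2.0**n_bits      # each value lands in [x, x + 1) / 2^n_bits
```

Flooring the result back to `n_bits` precision recovers the original discrete values exactly, which is what makes the bound on the discrete likelihood valid.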