
Fromage 🧀 optimiser

Jeremy Bernstein · Arash Vahdat · Yisong Yue · Ming-Yu Liu

Voulez-vous du fromage? (Would you like some cheese?)

To get started with Fromage in your PyTorch code, copy the file `fromage.py` into your project directory, then write:

```python
from fromage import Fromage
optimizer = Fromage(net.parameters(), lr=0.01)
```

We found an initial learning rate of 0.01 worked well in all experiments except model fine-tuning, where we used 0.001. You may want to experiment with learning rate decay schedules.
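To give a feel for what the optimiser does under the hood, here is a rough NumPy sketch of the Fromage update rule as we understand it from the paper; it is an illustration, not the repository's implementation, and the `eps` guard and variable names are our own assumptions:

```python
import numpy as np

def fromage_step(w, g, lr=0.01, eps=1e-12):
    """Sketch of one Fromage update on a single parameter tensor.

    Implements w <- (w - lr * g * ||w|| / ||g||) / sqrt(1 + lr^2),
    the update rule described in the paper. The `eps` term (our
    addition) guards against division by a zero-norm gradient.
    """
    w_norm = np.linalg.norm(w)
    g_norm = np.linalg.norm(g)
    # Rescale the gradient so the step size is proportional to the weight norm.
    step = lr * g * w_norm / (g_norm + eps)
    # The 1/sqrt(1 + lr^2) prefactor stops the weight norm from growing.
    return (w - step) / np.sqrt(1.0 + lr ** 2)
```

Note that when the gradient is orthogonal to the weights, this update leaves the weight norm exactly unchanged, which is the stability property the prefactor provides.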

About this repository

We've written an academic paper that proposes an optimisation algorithm based on a new geometric characterisation of deep neural networks. The paper is called:

On the distance between two neural networks and the stability of learning.

We're putting this code here so that you can test out our optimisation algorithm in your own applications, and also so that you can attempt to reproduce the experiments in our paper.

If something isn't clear or isn't working, let us know in the Issues section or contact us directly.

Repository structure

Here is the structure of this repository.

```
├── classify-cifar/         # CIFAR-10 classification experiments. ✅
├── classify-imagenet/      # ImageNet classification experiments. Coming soon! 🕒
├── classify-mnist/         # MNIST classification experiments. ✅
├── finetune-transformer/   # Transformer fine-tuning experiments. ✅
├── generate-cifar/         # CIFAR-10 class-conditional GAN experiments. Coming soon! 🕒
├── make-plots/             # Code to reproduce the figures in the paper. ✅
├── LICENSE                 # The license on our algorithm. ✅
├── README.md               # The very page you're reading now. ✅
└── fromage.py              # PyTorch code for the Fromage optimiser. ✅
```

Check back in a few days if the code you're after is missing. We're currently cleaning and posting it.



Citation

If you adore le fromage as much as we do, feel free to cite the paper:

```bibtex
@article{bernstein2020distance,
    title={On the distance between two neural networks and the stability of learning},
    author={Jeremy Bernstein and Arash Vahdat and Yisong Yue and Ming-Yu Liu},
    year={2020}
}
```


We are making our algorithm available under a CC BY-NC-SA 4.0 license. The other code we have used is subject to its own license restrictions, as indicated in the subfolders.
