A Re-implementation of Fixed-update Initialization (requires PyTorch 1.0)

Cite as:

Hongyi Zhang, Yann N. Dauphin, Tengyu Ma. Fixup Initialization: Residual Learning Without Normalization. 7th International Conference on Learning Representations (ICLR 2019).
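The core idea of the cited paper is to train residual networks without normalization by rescaling the initialization: the last layer of each residual branch is zero-initialized, and the remaining layers in the branch are scaled down by L^(-1/(2m-2)), where L is the number of residual branches and m the number of layers per branch. A minimal numpy sketch of that rule (the function names and the toy two-layer branch are illustrative, not from this repo):

```python
import numpy as np

def fixup_scale(num_residual_branches, layers_per_branch):
    """Fixup scaling factor L^(-1/(2m-2)), where L is the number of
    residual branches and m the number of layers inside each branch."""
    L, m = num_residual_branches, layers_per_branch
    return L ** (-1.0 / (2 * m - 2))

def init_branch(rng, shapes, scale):
    """Initialize one residual branch: He-style init scaled by `scale`
    for every layer except the last, which is zero-initialized so the
    whole branch starts out as the identity mapping."""
    weights = []
    for i, (fan_in, fan_out) in enumerate(shapes):
        if i == len(shapes) - 1:
            w = np.zeros((fan_in, fan_out))  # zero last layer of the branch
        else:
            w = rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in) * scale
        weights.append(w)
    return weights

rng = np.random.default_rng(0)
# ResNet-110 on CIFAR has 54 residual branches with 2 conv layers each,
# so the scale reduces to 54^(-1/2).
scale = fixup_scale(num_residual_branches=54, layers_per_branch=2)
branch = init_branch(rng, [(64, 64), (64, 64)], scale)
```

With m = 2 the exponent simplifies to -1/2, so deeper networks (larger L) get proportionally smaller initial weights in their residual branches.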

ResNet for CIFAR-10

The default arguments will train a ResNet-110 with Fixup + Mixup.


The following script will train a ResNet-32 model on GPU 0 with Fixup and no Mixup (alpha=0), with weight decay 5e-4 and the default learning rate 0.1 and batch size 128.

```
CUDA_VISIBLE_DEVICES=0 python -a fixup_resnet32 --sess benchmark_a0d5e4lr01 --seed 11111 --alpha 0. --decay 5e-4
```
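The `--alpha` flag above controls Mixup (Zhang et al., 2018), which trains on convex combinations of pairs of examples with a mixing weight drawn from Beta(alpha, alpha); alpha=0 disables mixing. A small sketch of the idea (function and variable names are illustrative, not this repo's API):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha, rng):
    """Mixup: blend two examples and their labels with a single
    mixing weight lam ~ Beta(alpha, alpha). alpha=0 means no mixing."""
    lam = rng.beta(alpha, alpha) if alpha > 0 else 1.0
    x = lam * x1 + (1 - lam) * x2
    y = lam * y1 + (1 - lam) * y2
    return x, y, lam

rng = np.random.default_rng(0)
x1, x2 = np.ones((3, 32, 32)), np.zeros((3, 32, 32))  # two toy CIFAR-shaped images
x, y, lam = mixup(x1, 1.0, x2, 0.0, alpha=0.7, rng=rng)
```

In practice the two examples come from a shuffled copy of the same minibatch, and the loss is the same convex combination of the per-example losses.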

ResNet for ImageNet

ImageNet models with training scripts are now available. (Thanks @tjingrant for help!)

Top-1 accuracy for ResNet-50 at Epoch 100 with Mixup (alpha=0.7) is around 76.0%.

Transformer for machine translation

A Transformer model with Fixup (in place of layer normalization) is available. To run the experiments, you will need to download and install the fairseq library (the provided code was tested on an earlier version). You can then copy the files into the corresponding folders.

An example script is provided to run the IWSLT experiments described in the paper. For more information, please refer to the instructions in the fairseq repo.
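For the Transformer, the paper replaces each LayerNorm with per-sublayer scalars: a multiplier initialized to 1 and a bias initialized to 0 around the residual branch, combined with the zero-initialized final projection inside the branch so every sublayer starts as the identity. A hedged sketch of that sublayer shape (names and signature are assumptions for illustration, not the repo's actual modules):

```python
import numpy as np

def fixup_sublayer(x, f, multiplier=1.0, bias=0.0):
    """One residual sublayer in the Fixup style: no LayerNorm; instead a
    learned scalar multiplier (init 1) and scalar bias (init 0) wrap the
    branch function `f` (e.g. attention or feed-forward)."""
    return x + multiplier * f(x + bias)

# At initialization Fixup zeroes the branch's last projection, so f
# returns zeros and the sublayer reduces to the identity mapping.
x0 = np.arange(4.0)
out = fixup_sublayer(x0, lambda z: np.zeros_like(z))
```

Starting every sublayer at the identity is what lets the deep stack train stably without normalization layers.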
