PyTorch implementation of the Filter Response Normalization Layer (FRN).

Replace BatchNorm2d + ReLU in your model with FRN + TLU yourself. It is currently difficult to automate this replacement with a helper function, because many models reuse the same ReLU instance in several places.
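As a reference for doing the replacement by hand, here is a minimal sketch of FRN + TLU following the formulation in the FRN paper (this is an illustrative implementation, not necessarily identical to the code in this repository):

```python
import torch
import torch.nn as nn


class FRN(nn.Module):
    """Filter Response Normalization: normalize each channel by the mean
    of its squared activations over the spatial dimensions (no batch
    statistics), then apply a learned affine transform."""

    def __init__(self, num_features, eps=1e-6):
        super().__init__()
        self.gamma = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        self.eps = eps

    def forward(self, x):
        # nu^2: mean of squared activations over H and W, per sample and channel
        nu2 = x.pow(2).mean(dim=(2, 3), keepdim=True)
        x = x * torch.rsqrt(nu2 + self.eps)
        return self.gamma * x + self.beta


class TLU(nn.Module):
    """Thresholded Linear Unit: max(y, tau) with a learned threshold tau,
    used in place of ReLU after FRN."""

    def __init__(self, num_features):
        super().__init__()
        self.tau = nn.Parameter(torch.zeros(1, num_features, 1, 1))

    def forward(self, x):
        return torch.max(x, self.tau)
```

A `BatchNorm2d(c)` + `ReLU()` pair would then become `FRN(c)` + `TLU(c)`; note that TLU, unlike ReLU, holds per-channel parameters, so it cannot be shared across layers.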
We use the Best Artworks of All Time | Kaggle dataset, which contains 49 artists and their pictures. In this experiment, we classify the artist from a picture.
- torch==1.3.1
- catalyst==19.11.6
- albumentations==0.4.3
- NVIDIA/apex (only if you use the --fp16 option)
If you can use the kaggle API command, you can download the dataset easily:

```
$ cd input
$ kaggle datasets download -d ikarus777/best-artworks-of-all-time
$ unzip best-artworks-of-all-time.zip -d artworks
```
Or download directly from Best Artworks of All Time | Kaggle
I assume the following directory structure:

```
input
├── artworks
│   ├── artists.csv
│   ├── images
│   │   └── images
│   │       ├── Alfred_Sisley
│   │       │   ├── Alfred_Sisley_1.jpg
│   │       │   ├── Alfred_Sisley_10.jpg
│   │       │   ├── ...
```
You can use the --fp16 option if you have installed NVIDIA/apex. However, FRN is not tuned for FP16, so turn off --fp16 when you use --frn.
```
$ python train_cls.py --fp16
$ python train_cls.py --frn
```