
Commit: add
zs-zhong committed Apr 14, 2021
1 parent ac9c354 commit bb5ba7e
Showing 43 changed files with 731,295 additions and 22 deletions.
57 changes: 35 additions & 22 deletions README.md
@@ -34,10 +34,10 @@ pip install -r requirements.txt
```

**Dataset Preparation**
- * [CIFAR-10-LT, CIFAR-100-LT](https://www.cs.toronto.edu/~kriz/cifar.html)
- * [ImageNet_LT](http://image-net.org/index)
* [CIFAR-10 & CIFAR-100](https://www.cs.toronto.edu/~kriz/cifar.html)
* [ImageNet](http://image-net.org/index)
* [iNaturalist 2018](https://github.com/visipedia/inat_comp/tree/master/2018)
- * [Places_LT](http://places2.csail.mit.edu/download.html)
* [Places](http://places2.csail.mit.edu/download.html)

Change the `data_path` in `config/*/*.yaml` accordingly.
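
For reference, the CIFAR-10 configs added in this commit point at a local `./data` directory; `data_path` should be edited to wherever the dataset actually lives:

```
# from config/cifar10/cifar10_imb001_stage1_mixup.yaml (added in this commit)
data_path: './data/cifar10'
```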

@@ -47,7 +47,7 @@ Change the `data_path` in `config/*/*.yaml` accordingly.

To train a model for Stage-1 with *mixup*, run:

- (one GPU for CIFAR-10-LT, CIFAR-100-LT, four GPUs for ImageNet-LT, iNaturalist 2018, Places-LT)
(one GPU for CIFAR-10-LT & CIFAR-100-LT, four GPUs for ImageNet-LT, iNaturalist 2018, and Places-LT)

```
python train_stage1.py --cfg ./config/DATASETNAME/DATASETNAME_ARCH_stage1_mixup.yaml
```
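
For example, using one of the CIFAR-10-LT configs added in this commit (the CIFAR configs are named by imbalance factor rather than architecture; `imb001` corresponds to `imb_factor: 0.01`, i.e. IF=100), the Stage-1 run would be launched as:

```
python train_stage1.py --cfg ./config/cifar10/cifar10_imb001_stage1_mixup.yaml
```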
@@ -64,6 +64,19 @@ To train a model for Stage-2 with *one GPU*, run:
```
python train_stage2.py --cfg ./config/DATASETNAME/DATASETNAME_ARCH_stage2_mislas.yaml resume /path/to/checkpoint/stage1
```
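
For example, Stage-2 for CIFAR-10-LT IF=100 would resume from the Stage-1 checkpoint written under `saved/` (see the directory layout below). The checkpoint path here is illustrative: the actual folder name combines the model name and the date.

```
# illustrative path; Stage-1 saves checkpoints under saved/<model name + date>/ckps/
python train_stage2.py --cfg ./config/cifar10/cifar10_imb001_stage2_mislas.yaml resume ./saved/cifar10_imb001_stage1_mixup_DATE/ckps/model_best.pth.tar
```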

The saved folder (including logs and checkpoints) is organized as follows.
```
MiSLAS
├── saved
│   ├── model name + date
│   │   ├── ckps
│   │   │   ├── current.pth.tar
│   │   │   └── model_best.pth.tar
│   │   └── logs
│   │       └── model name.txt
│   ...
```
## Evaluation

To evaluate a pre-trained model, run:
@@ -81,23 +94,23 @@ python eval.py --cfg ./config/DATASETNAME/DATASETNAME_ARCH_stage2_mislas.yaml re
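
As a concrete sketch (assuming `eval.py` takes the same `resume` argument as the training scripts, and with an illustrative checkpoint path), evaluating the CIFAR-10-LT IF=100 MiSLAS model from the tables below would look like:

```
# /path/to/checkpoint/stage2 is a placeholder for a downloaded or locally trained Stage-2 checkpoint
python eval.py --cfg ./config/cifar10/cifar10_imb001_stage2_mislas.yaml resume /path/to/checkpoint/stage2
```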

| Dataset | Top-1 Accuracy | ECE (15 bins) | Model |
| -------------------- | -------------- | ------------- | ----- |
- | CIFAR-10-LT IF=10 | 87.6% | 11.9% | link |
- | CIFAR-10-LT IF=50 | 78.1% | 2.49% | link |
- | CIFAR-10-LT IF=100 | 72.8% | 2.14% | link |
- | CIFAR-100-LT IF=10 | 59.1% | 5.24% | link |
- | CIFAR-100-LT IF=50 | 45.4% | 4.33% | link |
- | CIFAR-100-LT IF=100 | 39.5% | 8.82% | link |
| CIFAR-10-LT IF=10 | 87.6% | 11.9% | [link](https://drive.google.com/file/d/1dV1hchsIR5kTSqSOhdEs6nnXApcH5wEG/view?usp=sharing) |
| CIFAR-10-LT IF=50 | 78.1% | 2.49% | [link](https://drive.google.com/file/d/1LoczjQRK20u_HpFMLmzeT0pVCp3V-gyf/view?usp=sharing) |
| CIFAR-10-LT IF=100 | 72.8% | 2.14% | [link](https://drive.google.com/file/d/1TFetlV4MT4zjKEAPKcZuzmY2Dgtcqmsd/view?usp=sharing) |
| CIFAR-100-LT IF=10 | 59.1% | 5.24% | [link](https://drive.google.com/file/d/1BmLjPReBoH6LJwl5x8_zSPnm1f6N_Cp0/view?usp=sharing) |
| CIFAR-100-LT IF=50 | 45.4% | 4.33% | [link](https://drive.google.com/file/d/1l0LfZozJxWgzKp2IgM9mSpfwjTsIC-Mg/view?usp=sharing) |
| CIFAR-100-LT IF=100 | 39.5% | 8.82% | [link](https://drive.google.com/file/d/15dHVdkI8J-oKkeQqyj6FtrHtIpO_TYfq/view?usp=sharing) |

* Stage-2 (*MiSLAS*):

| Dataset | Top-1 Accuracy | ECE (15 bins) | Model |
| -------------------- | -------------- | ------------- | ----- |
- | CIFAR-10-LT IF=10 | 90.0% | 1.20% | link |
- | CIFAR-10-LT IF=50 | 85.7% | 2.01% | link |
- | CIFAR-10-LT IF=100 | 82.5% | 3.66% | link |
- | CIFAR-100-LT IF=10 | 63.2% | 1.73% | link |
- | CIFAR-100-LT IF=50 | 52.3% | 2.47% | link |
- | CIFAR-100-LT IF=100 | 47.0% | 4.83% | link |
| CIFAR-10-LT IF=10 | 90.0% | 1.20% | [link](https://drive.google.com/file/d/1iST8Tr2LQ8nIjTNT1CKiQ-1T-RKxAvqr/view?usp=sharing) |
| CIFAR-10-LT IF=50 | 85.7% | 2.01% | [link](https://drive.google.com/file/d/15bfA7uJsyM8eTwoptwp452kStk6FYT7v/view?usp=sharing) |
| CIFAR-10-LT IF=100 | 82.5% | 3.66% | [link](https://drive.google.com/file/d/1KOTkjTOhIP5UOhqvHGJzEqq4_kQGKSJY/view?usp=sharing) |
| CIFAR-100-LT IF=10 | 63.2% | 1.73% | [link](https://drive.google.com/file/d/1N2ai-l1hsbXTp_25Hoh5BSoAmR1_0UVD/view?usp=sharing) |
| CIFAR-100-LT IF=50 | 52.3% | 2.47% | [link](https://drive.google.com/file/d/1Z2nukCMTG0cMmGXzZip3zIwv2WB5cOiZ/view?usp=sharing) |
| CIFAR-100-LT IF=100 | 47.0% | 4.83% | [link](https://drive.google.com/file/d/1bX3eM-hlxGvEGuHBcfNhuz6VNp32Y0IQ/view?usp=sharing) |

*Note: To obtain better performance, we highly recommend changing the weight decay from 2e-4 to 5e-4 on CIFAR-LT.*
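
A minimal sketch of that change as it would appear in the CIFAR config files (the configs added in this commit ship with `weight_decay: 2e-4`):

```
# the CIFAR configs in this commit default to weight_decay: 2e-4;
# the note above recommends raising it for CIFAR-LT:
weight_decay: 5e-4
```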

@@ -107,17 +120,17 @@ python eval.py --cfg ./config/DATASETNAME/DATASETNAME_ARCH_stage2_mislas.yaml re

| Dataset | Arch | Top-1 Accuracy | ECE (15 bins) | Model |
| ----------- | ---------- | -------------- | ------------- | ----- |
- | ImageNet-LT | ResNet-50 | 45.5% | 7.98% | link |
- | iNa'2018 | ResNet-50 | 66.9% | 5.37% | link |
- | Places-LT | ResNet-152 | 29.4% | 16.7% | link |
| ImageNet-LT | ResNet-50 | 45.5% | 7.98% | [link](https://drive.google.com/file/d/1QKVnK7n75q465ppf7wkK4jzZvZJE_BPi/view?usp=sharing) |
| iNa'2018 | ResNet-50 | 66.9% | 5.37% | [link](https://drive.google.com/file/d/1wvj-cITz8Ps1TksLHi_KoGsq9CecXcVt/view?usp=sharing) |
| Places-LT | ResNet-152 | 29.4% | 16.7% | [link](https://drive.google.com/file/d/1Tx-tY5Y8_-XuGn9ZdSxtAm0onOsKWhUH/view?usp=sharing) |

* Stage-2 (*MiSLAS*):

| Dataset | Arch | Top-1 Accuracy | ECE (15 bins) | Model |
| ----------- | ---------- | -------------- | ------------- | ----- |
- | ImageNet-LT | ResNet-50 | 52.7% | 1.78% | link |
- | iNa'2018 | ResNet-50 | 71.6% | 7.67% | link |
- | Places-LT | ResNet-152 | 40.4% | 3.41% | link |
| ImageNet-LT | ResNet-50 | 52.7% | 1.78% | [link](https://drive.google.com/file/d/1ofJKlUJZQjjkoFU9MLI08UP2uBvywRgF/view?usp=sharing) |
| iNa'2018 | ResNet-50 | 71.6% | 7.67% | [link](https://drive.google.com/file/d/1crOo3INxqkz8ZzKZt9pH4aYb3-ep4lo-/view?usp=sharing) |
| Places-LT | ResNet-152 | 40.4% | 3.41% | [link](https://drive.google.com/file/d/1DgL0aN3UadI3UoHU6TO7M6UD69QgvnbT/view?usp=sharing) |

## <a name="Citation"></a>Citation

50 changes: 50 additions & 0 deletions config/cifar10/cifar10_imb001_stage1_mixup.yaml
@@ -0,0 +1,50 @@
name: cifar10_imb001_stage1_mixup
print_freq: 40
workers: 16
log_dir: 'logs'
model_dir: 'ckps'

# dataset & model setting
dataset: 'cifar10'
data_path: './data/cifar10'
num_classes: 10
imb_factor: 0.01           # corresponds to imbalance factor (IF) 100 in the results tables
backbone: 'resnet32_fe'
resume: ''
head_class_idx:            # the three *_class_idx blocks are [start, end) class-index ranges for the head / medium / tail splits
- 0
- 3
med_class_idx:
- 3
- 7
tail_class_idx:
- 7
- 10


# distributed training
deterministic: False
distributed: False
gpu: null
world_size: -1
rank: -1
dist_url: 'tcp://224.66.41.62:23456'
dist_backend: 'nccl'
multiprocessing_distributed: False



# Train
mode: 'stage1'
lr: 0.1
batch_size: 128
weight_decay: 2e-4
num_epochs: 200
momentum: 0.9
cos: False
mixup: True
alpha: 1.0                 # mixup interpolation ratio is sampled from Beta(alpha, alpha)




54 changes: 54 additions & 0 deletions config/cifar10/cifar10_imb001_stage2_mislas.yaml
@@ -0,0 +1,54 @@
name: cifar10_imb001_stage2_mislas
print_freq: 40
workers: 16
log_dir: 'logs'
model_dir: 'ckps'

# dataset & model setting
dataset: 'cifar10'
data_path: './data/cifar10'
num_classes: 10
imb_factor: 0.01
backbone: 'resnet32_fe'
resume: 'Path/to/Stage1_checkpoint.pth.tar'
head_class_idx:
- 0
- 3
med_class_idx:
- 3
- 7
tail_class_idx:
- 7
- 10


# distributed training
deterministic: False
distributed: False
gpu: null
world_size: -1
rank: -1
dist_url: 'tcp://224.66.41.62:23456'
dist_backend: 'nccl'
multiprocessing_distributed: False



# Train
mode: 'stage2'
smooth_head: 0.3           # label-aware smoothing: smoothing factor for the most frequent (head) class
smooth_tail: 0.0           # label-aware smoothing: smoothing factor for the least frequent (tail) class
shift_bn: False
lr_factor: 0.5
lr: 0.1
batch_size: 128
weight_decay: 2e-4
num_epochs: 10
momentum: 0.9
mixup: False
alpha: null





50 changes: 50 additions & 0 deletions config/cifar10/cifar10_imb002_stage1_mixup.yaml
@@ -0,0 +1,50 @@
name: cifar10_imb002_stage1_mixup
print_freq: 40
workers: 16
log_dir: 'logs'
model_dir: 'ckps'

# dataset & model setting
dataset: 'cifar10'
data_path: './data/cifar10'
num_classes: 10
imb_factor: 0.02
backbone: 'resnet32_fe'
resume: ''
head_class_idx:
- 0
- 3
med_class_idx:
- 3
- 7
tail_class_idx:
- 7
- 10


# distributed training
deterministic: False
distributed: False
gpu: null
world_size: -1
rank: -1
dist_url: 'tcp://224.66.41.62:23456'
dist_backend: 'nccl'
multiprocessing_distributed: False



# Train
mode: 'stage1'
lr: 0.1
batch_size: 128
weight_decay: 2e-4
num_epochs: 200
momentum: 0.9
cos: False
mixup: True
alpha: 1.0




54 changes: 54 additions & 0 deletions config/cifar10/cifar10_imb002_stage2_mislas.yaml
@@ -0,0 +1,54 @@
name: cifar10_imb002_stage2_mislas
print_freq: 40
workers: 16
log_dir: 'logs'
model_dir: 'ckps'

# dataset & model setting
dataset: 'cifar10'
data_path: './data/cifar10'
num_classes: 10
imb_factor: 0.02
backbone: 'resnet32_fe'
resume: 'Path/to/Stage1_checkpoint.pth.tar'
head_class_idx:
- 0
- 3
med_class_idx:
- 3
- 7
tail_class_idx:
- 7
- 10


# distributed training
deterministic: False
distributed: False
gpu: null
world_size: -1
rank: -1
dist_url: 'tcp://224.66.41.62:23456'
dist_backend: 'nccl'
multiprocessing_distributed: False



# Train
mode: 'stage2'
smooth_head: 0.2
smooth_tail: 0.0
shift_bn: False
lr_factor: 0.2
lr: 0.1
batch_size: 128
weight_decay: 2e-4
num_epochs: 10
momentum: 0.9
mixup: False
alpha: null





50 changes: 50 additions & 0 deletions config/cifar10/cifar10_imb01_stage1_mixup.yaml
@@ -0,0 +1,50 @@
name: cifar10_imb01_stage1_mixup
print_freq: 40
workers: 16
log_dir: 'logs'
model_dir: 'ckps'

# dataset & model setting
dataset: 'cifar10'
data_path: './data/cifar10'
num_classes: 10
imb_factor: 0.1
backbone: 'resnet32_fe'
resume: ''
head_class_idx:
- 0
- 3
med_class_idx:
- 3
- 7
tail_class_idx:
- 7
- 10


# distributed training
deterministic: False
distributed: False
gpu: null
world_size: -1
rank: -1
dist_url: 'tcp://224.66.41.62:23456'
dist_backend: 'nccl'
multiprocessing_distributed: False



# Train
mode: 'stage1'
lr: 0.1
batch_size: 128
weight_decay: 2e-4
num_epochs: 200
momentum: 0.9
cos: False
mixup: True
alpha: 1.0



