🐛 Bug description
It seems that the learning rate initialization in create_lr_scheduler_with_warmup is not working correctly.
How to reproduce it
import torchvision.models as models
import torch.optim as optim
from ignite.handlers.param_scheduler import create_lr_scheduler_with_warmup

model = models.resnet18()
total_iteration = 100
warmup_iteration = 10
initial_lr = 1e-3
warmup_initial_lr = 1e-5

optimizer = optim.Adam(model.parameters(), lr=initial_lr)

# Cosine annealing wrapped with a linear warmup from 1e-5 up to 1e-3
lr_scheduler = create_lr_scheduler_with_warmup(
    optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=total_iteration),
    warmup_start_value=warmup_initial_lr,
    warmup_duration=warmup_iteration,
    warmup_end_value=initial_lr,
)

for _ in range(total_iteration):
    # Print the current lr, then step the scheduler
    print(optimizer.param_groups[0]["lr"])
    lr_scheduler(None)
Results
0.001
1e-05
0.00012
0.00023
...
2.9559615522887284e-05
I expected the optimizer's learning rate to be 1e-5 at the first iteration, but instead I got 1e-3.
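For what it's worth, the printed values suggest that the warmup start value is only written to the optimizer once the scheduler is first invoked: the initial 0.001 is the value Adam was constructed with, and 1e-05 appears immediately after the first call. A minimal workaround sketch under that assumption (continuing from the snippet above) is to invoke the scheduler before reading the learning rate; in an Engine-based setup the equivalent is attaching the scheduler to Events.ITERATION_STARTED.

# Workaround sketch (assumption: the scheduler only overwrites
# optimizer.param_groups when it is called, matching the output above).
# Calling it before reading the lr makes the first observed value 1e-5.
for _ in range(total_iteration):
    lr_scheduler(None)
    print(optimizer.param_groups[0]["lr"])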
Environment
- PyTorch Version (e.g., 1.4):
- Ignite Version (e.g., 0.3.0):
- OS (e.g., Linux):
- How you installed Ignite (conda, pip, source): pip
- Python version:
- Any other relevant information: Run on Colab