perf(finetune): add weight decay, set step-wise LR decay to 7 epochs
zjZSTU committed Mar 31, 2020
1 parent 37d5204 commit 38fc365
Showing 1 changed file with 1 addition and 1 deletion.
py/finetune.py: 1 addition & 1 deletion
@@ -127,7 +127,7 @@ def train_model(data_loaders, model, criterion, optimizer, lr_scheduler, num_epo
 
 criterion = nn.CrossEntropyLoss()
 # optimizer = optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
-optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=1e-4)
+optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=1e-4, weight_decay=1e-4)
 lr_scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=15, gamma=0.1)
 
 best_model = train_model(data_loaders, model, criterion, optimizer, lr_scheduler, device=device, num_epochs=50)
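For context, below is a minimal, self-contained sketch of the optimizer and scheduler setup shown in the hunk above. It is not the repository's full py/finetune.py: the AlexNet backbone, the two-class head, and the variable names around them are assumptions added for illustration.

import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import models

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Pretrained backbone with its final classifier layer swapped for the new task
# (backbone and class count are assumptions, not taken from the repository).
model = models.alexnet(pretrained=True)
num_classes = 2  # assumed; depends on the fine-tuning dataset
model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
model = model.to(device)

criterion = nn.CrossEntropyLoss()

# Adam over the trainable parameters only; weight_decay=1e-4 is the L2 penalty
# this commit adds to the optimizer.
optimizer = optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()),
    lr=1e-4,
    weight_decay=1e-4,
)

# Step decay: multiply the learning rate by gamma every step_size epochs.
# The hunk above keeps step_size=15; the commit title mentions a 7-epoch step.
lr_scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=15, gamma=0.1)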
