In your example, warmup_steps=30000 and steps_in_epoch=10000, so the condition timestep < warmup_steps is always met.
I guess the constants in the provided example are a mistake.
I want to use your lr scheduler, but I can't figure out how to use it properly.
When should I call scheduler.step() inside the per-epoch training loop? On every step? Or should I keep incrementing timestep across epochs, and only call scheduler.step() once timestep > warmup_steps?
```python
scheduler = WarmupReduceLROnPlateauScheduler(
    optimizer,
    init_lr=1e-10,
    peak_lr=1e-4,
    warmup_steps=30000,
    patience=1,
    factor=0.3,
)

for epoch in range(max_epochs):
    for timestep in range(steps_in_epoch):
        ...
        ...
        if timestep < warmup_steps:
            scheduler.step()

    val_loss = validate()
    scheduler.step(val_loss)
```
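For now I am guessing that the warmup is supposed to be counted over a global step counter that does not reset every epoch, roughly like the sketch below (global_step, train_one_step() and validate() are just my placeholders, not from your library). Is this the intended usage?

```python
# My current guess (not sure this is the intended usage): count warmup over a
# global step counter that does NOT reset each epoch, call scheduler.step()
# once per training step while warming up, and scheduler.step(val_loss) once
# per epoch afterwards. train_one_step()/validate() are my own placeholders.
global_step = 0
for epoch in range(max_epochs):
    for _ in range(steps_in_epoch):
        train_one_step()          # forward / backward / optimizer.step()
        if global_step < warmup_steps:
            scheduler.step()      # advance the warmup ramp towards peak_lr
        global_step += 1

    val_loss = validate()
    if global_step >= warmup_steps:
        scheduler.step(val_loss)  # plateau logic only after warmup is done
```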