
fix use of decays #303

Merged 1 commit into master on Jun 10, 2021

Conversation

CarloLucibello
Member

`WeightDecay` and similar rules should always be applied before the actual optimizer; otherwise their contribution to the gradient won't be multiplied by the learning rate.
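To see why the ordering matters, here is a minimal sketch (plain Python, not Flux's actual implementation) of chained update rules, where each rule transforms the running update in sequence. The rule names and constants are illustrative assumptions:

```python
# Hypothetical chained-optimizer sketch (NOT Flux's code): each rule
# receives the accumulated update "delta" and the current weight w.
def weight_decay(delta, w, wd=0.1):
    # adds the decay term to the gradient contribution
    return delta + wd * w

def descent(delta, w, lr=0.01):
    # scales the accumulated update by the learning rate
    return lr * delta

def chained_update(rules, w, grad):
    delta = grad
    for rule in rules:
        delta = rule(delta, w)
    return delta

w, grad = 2.0, 1.0
# decay first: the decay term IS scaled by lr -> 0.01 * (1.0 + 0.1 * 2.0) = 0.012
good = chained_update([weight_decay, descent], w, grad)
# decay last: the decay term is NOT scaled by lr -> 0.01 * 1.0 + 0.1 * 2.0 = 0.21
bad = chained_update([descent, weight_decay], w, grad)
print(good, bad)
```

With the decay applied last, the weight-decay term bypasses the learning rate entirely, which is why that ordering is broken.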

cc @carlobaldassi

@CarloLucibello CarloLucibello merged commit 5d45300 into master Jun 10, 2021
bors bot added a commit to FluxML/Flux.jl that referenced this pull request Jun 10, 2021
1612: fix AdamW and improve decays docs r=DhairyaLGandhi a=CarloLucibello

There is great disorder under the sky with optimizers. Since, when chaining optimizers,
```julia
opt = Optimizer(opt1, opt2)
```
the order generally matters (a lot!), we have to be very careful in documenting how to use decays. In fact, we were giving completely wrong directions for `InvDecay` and `ExpDecay`. The correct ordering for standard use is

```julia
Optimizer(WeightDecay(), ADAM())   # equivalent to L2 regularization
Optimizer(ADAM(), InvDecay())   # learning rate scheduling
Optimizer(ADAM(), ExpDecay())   # learning rate scheduling
```
Different orderings should typically be considered bugs in user code.

This PR fixes the examples and tries to clarify the documentation in this regard.

Also fixes AdamW, which was doing something totally wrong due to the aforementioned confusion. 
(see https://towardsdatascience.com/why-adamw-matters-736223f31b5d for how AdamW works).
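The AdamW distinction can be sketched as follows (illustrative Python only; the parameter names and the exact placement of the decoupled decay are assumptions, not Flux's code). With L2 regularization the decay term enters Adam's moment estimates and gets rescaled by the adaptive denominator; with AdamW the decay is applied directly to the weights, outside the adaptive update:

```python
import math

def adam_step(w, g, m, v, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, t=1):
    # one bias-corrected Adam step on gradient g (scalar, for illustration)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    mhat = m / (1 - b1 ** t)
    vhat = v / (1 - b2 ** t)
    return w - lr * mhat / (math.sqrt(vhat) + eps), m, v

def adam_l2_step(w, g, m, v, wd=0.1, **kw):
    # L2 regularization: decay enters the gradient, so Adam's adaptive
    # denominator rescales it along with everything else
    return adam_step(w, g + wd * w, m, v, **kw)

def adamw_step(w, g, m, v, wd=0.1, lr=1e-3, **kw):
    # AdamW: decay is decoupled -- applied to the weights directly,
    # untouched by the moment estimates
    w, m, v = adam_step(w, g, m, v, lr=lr, **kw)
    return w - lr * wd * w, m, v

w_l2, _, _ = adam_l2_step(1.0, 0.5, 0.0, 0.0)
w_aw, _, _ = adamw_step(1.0, 0.5, 0.0, 0.0)
print(w_l2, w_aw)  # the two schemes produce different weights after one step
```

Folding the decay into the gradient (L2) is exactly the ordering bug described above when applied to Adam, which is why AdamW needs the decoupled form.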

Related in model-zoo: FluxML/model-zoo#303 and FluxML/model-zoo#304

Co-authored-by: CarloLucibello <carlo.lucibello@gmail.com>
Co-authored-by: Carlo Lucibello <carlo.lucibello@unibocconi.it>