
Implementation of LAMB optimizer #6868

Open
Atharva-Phatak opened this issue Oct 28, 2022 · 7 comments
Atharva-Phatak commented Oct 28, 2022

🚀 The feature

#6323 Mentions that torchvision is looking to implement LAMB optimizer.
@datumbox I would very much like to take this issue and create a PR.

Motivation, pitch

The LAMB optimizer was created because the LARS optimizer performed poorly, mainly on models with attention mechanisms. LAMB has been shown to achieve strong performance gains across various tasks, and I believe it should be implemented in torchvision.
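For context, a minimal sketch of the LAMB update rule from the original paper (You et al., 2019), written in plain Python for illustration only. The function name `lamb_step` and its default hyperparameters are hypothetical; a real torchvision/PyTorch implementation would operate on tensors through the `torch.optim.Optimizer` API rather than on Python lists.

```python
import math

def lamb_step(w, g, m, v, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-6, weight_decay=0.01, step=1):
    """One hypothetical LAMB update for a single layer's weights.

    w, g, m, v: lists of floats (weights, gradients, first and second moments).
    Returns updated (w, m, v). Illustrative only, not a PyTorch optimizer.
    """
    # Adam-style exponential moving averages of the gradient and its square
    m = [beta1 * mi + (1 - beta1) * gi for mi, gi in zip(m, g)]
    v = [beta2 * vi + (1 - beta2) * gi * gi for vi, gi in zip(v, g)]

    # Bias correction, as in Adam
    m_hat = [mi / (1 - beta1 ** step) for mi in m]
    v_hat = [vi / (1 - beta2 ** step) for vi in v]

    # Adam update direction plus decoupled weight decay
    r = [mh / (math.sqrt(vh) + eps) + weight_decay * wi
         for mh, vh, wi in zip(m_hat, v_hat, w)]

    # Layer-wise trust ratio ||w|| / ||r||, falling back to 1.0;
    # this per-layer scaling is what distinguishes LAMB from Adam
    w_norm = math.sqrt(sum(wi * wi for wi in w))
    r_norm = math.sqrt(sum(ri * ri for ri in r))
    trust = w_norm / r_norm if w_norm > 0 and r_norm > 0 else 1.0

    w = [wi - lr * trust * ri for wi, ri in zip(w, r)]
    return w, m, v
```

The key design point is the layer-wise trust ratio: each layer's step size is scaled by the ratio of its weight norm to its update norm, which is what lets LAMB keep training stable at very large batch sizes where LARS (which lacks the Adam-style second moment) struggles on attention models.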

Alternatives

No response

Additional context

No response

@Dsantra92

Are you still working on this issue?

@Atharva-Phatak

@Dsantra92 Yes

@ashikshafi08

Is anyone currently working on this, or can I plan to work on it?

@Atharva-Phatak

@ashikshafi08 Go ahead and implement it, as I don't have the bandwidth to do so.

@ashikshafi08

I am not sure where I would push this code. Under what directory should the optimizers go?


datumbox commented May 2, 2023

@ashikshafi08 See this in-progress PR for adding LARS: pytorch/pytorch#88106


janeyx99 commented May 2, 2023

Hey! To set some expectations on this from the core side: we currently do not have much review bandwidth for optimizers, so it may take a while for us to verify whether LAMB should live in core and then review a LAMB implementation.
