
Is there a publication of Ranger? #27

Closed
nuzrub opened this issue Apr 4, 2020 · 2 comments

Comments


nuzrub commented Apr 4, 2020

I want to cite Ranger in a Medium article, and I would like to know whether there is an arXiv preprint of Ranger or a peer-reviewed paper published at a conference or in a journal.

I saw you linked a paper in the README.md, but it does not seem to be about Ranger, as the word does not appear anywhere in it. I know the RAdam and Lookahead papers, but the Ranger one is missing from my library. Thanks

@lessw2020 (Owner)

Hi @nuzrub - thanks for checking. There's no published paper on Ranger at this point, just the Medium article where I introduced it: https://medium.com/@lessw/new-deep-learning-optimizer-ranger-synergistic-combination-of-radam-lookahead-for-the-best-of-2dc83f79a48d
Most papers have just cited the Medium article.
I keep thinking we'll soon have an automated optimizer that does auto learning rate, so I never pushed toward a paper :) but it looks like that time is still further out.
Good luck with your article and feel free to post a link here.
Hope you are doing well!
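(For readers landing here: the combination the linked article describes is RAdam as the inner optimizer wrapped in Lookahead. Below is a minimal NumPy sketch of just the Lookahead mechanism around a plain gradient step; it is illustrative only, not the actual Ranger implementation, and `lookahead_gd` and its parameter names are made up for this example.)

```python
import numpy as np

def lookahead_gd(grad_fn, x0, lr=0.1, k=5, alpha=0.5, steps=100):
    """Lookahead: run k 'fast' inner-optimizer steps, then pull the
    'slow' weights toward the fast weights by a factor alpha."""
    slow = np.asarray(x0, dtype=float)
    fast = slow.copy()
    for step in range(1, steps + 1):
        fast -= lr * grad_fn(fast)         # fast (inner) optimizer step
        if step % k == 0:                  # every k steps...
            slow += alpha * (fast - slow)  # ...interpolate slow weights
            fast = slow.copy()             # restart fast weights at slow
    return slow

# Minimize f(x) = ||x||^2, whose gradient is 2x.
x_min = lookahead_gd(lambda x: 2 * x, x0=[3.0, -2.0])
```

In Ranger the inner `fast` step is an RAdam update rather than vanilla gradient descent, but the slow/fast interpolation is the same idea.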

nuzrub (Author) commented Apr 5, 2020

Thanks for replying. I do believe you should take some time to write about Ranger, as I think it is a great idea. In today's arXiv world, publishing ideas to the scientific community does not need to be as time-consuming as it used to be when the only venues were journals and conferences.
Auto learning rate is something I believe would be great for everyone. However, I also believe there will always be some other parameter at the optimizer level: if we find a way to automatically search for a good learning rate, there will be another parameter controlling how fast that automatic search should adapt. We are all slaves to hyperparameters in the end :x
Thanks again; I will let you know when I publish my article o/

@nuzrub nuzrub closed this as completed Apr 5, 2020