Optimizer transition: Adam, Adadelta #555
You can basically use the RMSprop or the SGD optimizer in core/optimizers as a basis to reimplement Adam and Adadelta. So, take a look at the ann RMSprop implementation (ann/optimizer/rmsprop.hpp) and compare it with the new RMSprop implementation in core/optimizers/rmsprop_impl.hpp. Once there are Adam and Adadelta implementations in core/optimizers, I'll go and delete the ann/optimizer folder. It would be great if you could start with one of the optimizers (Adam or Adadelta). Maybe someone else would like to pick up the other optimizer; if not, you are more than welcome to reimplement both Adam and Adadelta.
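For reference, the Adam update itself is compact. The sketch below is only an illustration of what the core of such an optimizer could look like, assuming the decomposable-function interface the existing core/optimizers SGD uses (`NumFunctions()`, `Gradient(iterate, i, gradient)`); the class name `AdamSketch` and the exact constructor signature are placeholders, not the actual mlpack code.

```cpp
// Minimal sketch of an Adam-style Optimize() loop, assuming the
// decomposable-function interface used by the existing core/optimizers.
// Names and signatures are illustrative, not the final mlpack implementation.
#include <armadillo>
#include <cmath>

template<typename DecomposableFunctionType>
class AdamSketch
{
 public:
  AdamSketch(DecomposableFunctionType& function,
             const double stepSize = 0.001,
             const double beta1 = 0.9,
             const double beta2 = 0.999,
             const double eps = 1e-8,
             const size_t maxIterations = 100000) :
      function(function), stepSize(stepSize), beta1(beta1), beta2(beta2),
      eps(eps), maxIterations(maxIterations) { }

  void Optimize(arma::mat& iterate)
  {
    // First and second moment estimates, initialized to zero.
    arma::mat m = arma::zeros<arma::mat>(iterate.n_rows, iterate.n_cols);
    arma::mat v = arma::zeros<arma::mat>(iterate.n_rows, iterate.n_cols);
    arma::mat gradient(iterate.n_rows, iterate.n_cols);

    const size_t numFunctions = function.NumFunctions();
    for (size_t i = 0, t = 0; i < maxIterations; ++i)
    {
      // Visit one separable part of the objective per iteration.
      const size_t current = i % numFunctions;
      function.Gradient(iterate, current, gradient);
      ++t;

      // Update the biased first and second moment estimates.
      m = beta1 * m + (1.0 - beta1) * gradient;
      v = beta2 * v + (1.0 - beta2) * (gradient % gradient);

      // Bias-corrected step size, then the parameter update.
      const double correction = std::sqrt(1.0 - std::pow(beta2, t)) /
          (1.0 - std::pow(beta1, t));
      iterate -= stepSize * correction * (m / (arma::sqrt(v) + eps));
    }
  }

 private:
  DecomposableFunctionType& function;
  double stepSize, beta1, beta2, eps;
  size_t maxIterations;
};
```

A real implementation would also track the objective value via `Evaluate()` and stop on a tolerance, as the existing SGD/RMSprop optimizers do.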
@zoq I'll take up Adadelta, if not both.
@zoq Then I will be working on Adam first.
@zoq Hi! I implemented the transition on Adam. If you could have a look, that would be great. https://github.com/anamariabs/mlpack/tree/master/src/mlpack/core/optimizers/adam
Looks good to me; please open a pull request that also includes the test.
Okay, maybe we can avoid such situations in the future by assigning the task to the first person who says they'd like to work on the issue. I'll take a quick look at the pull requests and maybe we can find a way to include both.
Since @sshkhrnwbie was the first who said he would like to work on the issue, I'll assign the issue to him; Adadelta is still open. Sorry @awhitesong, but I guess that's the best way to avoid such situations.
I guess this is sort of my fault. I wanted to get familiar with the code, and all the tasks proposed on the mlpack GSoC site for the projects you mentor were sort of taken by someone else. So I just went along with one to find out if I could solve it and learn something in the process. Sure, it's fine by me to merge the pull requests. Also, if you could suggest some different tasks, I'd like that. Thanks.
@zoq, that will be great to do from next time. I just finished implementing Adadelta with tests here: https://github.com/vasanthkalingeri/mlpack/tree/master/src/mlpack/core/optimizers/adadelta. I am preparing a pull request for this that does not include Adam and will send it shortly. @anamariabs, it was a genuine communication gap, so neither of us is to blame.
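For comparison, the Adadelta recurrences themselves fit in a few lines. The snippet below only illustrates the per-iteration math from Zeiler's paper (accumulated squared gradient and accumulated squared update); it is not the code from the linked branch, and the function name and default `rho`/`eps` values are assumptions.

```cpp
// Illustration of one Adadelta step on an Armadillo matrix; this mirrors
// the update rule from the paper, not the code in the linked branch.
#include <armadillo>

void AdadeltaStep(arma::mat& iterate,
                  const arma::mat& gradient,
                  arma::mat& meanSquaredGradient,  // running E[g^2]
                  arma::mat& meanSquaredUpdate,    // running E[dx^2]
                  const double rho = 0.95,
                  const double eps = 1e-6)
{
  // Accumulate the squared gradient.
  meanSquaredGradient = rho * meanSquaredGradient +
      (1.0 - rho) * (gradient % gradient);

  // Compute the update from the ratio of the two running averages.
  const arma::mat update = -(arma::sqrt(meanSquaredUpdate + eps) /
      arma::sqrt(meanSquaredGradient + eps)) % gradient;

  // Accumulate the squared update and apply it.
  meanSquaredUpdate = rho * meanSquaredUpdate +
      (1.0 - rho) * (update % update);
  iterate += update;
}
```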
@zoq Sure :)
@zoq I would like to work on Adadelta.
@anamariabs @vasanthkalingeri If you agree with my solution, it would be great if you both could update your pull requests; if not, that's fine, and in that case I'll go with the first pull request. @MurtyShikhar It looks like @sshkhrnwbie deleted his response to the issue; I'm not sure if this was a mistake. If he doesn't respond in the next two days, I guess this issue is open again and you are free to work on it.
We recently decided to use the optimizers in core/optimizers for the network modules instead of writing special optimizers just for the network modules. However, we had already implemented some methods that aren't available to the rest of mlpack, like Adam, Adadelta, and RMSprop (methods/ann/optimizer). Actually, that's not entirely true: we reimplemented RMSprop in da1207a and made that optimizer available to the rest of mlpack. It should be fairly easy to reimplement the other two methods as well.
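As a rough illustration of why the transition makes these methods reusable: the core/optimizers classes only assume a decomposable objective, so anything exposing roughly the interface below can be optimized. This is a hedged sketch based on how the existing SGD/RMSprop optimizers are used elsewhere in mlpack; the exact signatures should be checked against core/optimizers/sgd.hpp, and the type name is illustrative.

```cpp
// Sketch of the decomposable-function interface the core optimizers expect,
// based on the existing SGD/RMSprop implementations (verify against
// core/optimizers/sgd.hpp); the class and member names are illustrative.
#include <armadillo>
#include <cstddef>

class ExampleFunction
{
 public:
  // Number of separable parts of the objective (e.g. data points).
  size_t NumFunctions() const;

  // Value of the i-th part of the objective at the given coordinates.
  double Evaluate(const arma::mat& coordinates, const size_t i) const;

  // Gradient of the i-th part of the objective.
  void Gradient(const arma::mat& coordinates,
                const size_t i,
                arma::mat& gradient) const;
};
```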