Add new API for some optimizers #1026

Merged

merged 2 commits into mlpack:master on Jun 14, 2017

Conversation

4 participants
@ShangtongZhang (Member) commented Jun 13, 2017

As discussed in #1014, I added new APIs for some optimizers (ada_delta, ada_grad, adam, gradient_descent, minibatch_sgd, rmsprop, sgd, smorms3). I only added the new API as @zoq suggested and didn't delete the old one, so even after this PR is merged, all the optimizers will still have a consistent API to some extent. We need to continue refactoring all the optimizers and eventually delete the old API; maybe @rcurtin, @micyril, and I can work on this together in the future. But I think it's necessary to merge this PR now so that I can move forward with the DQN PR and the async RL methods.

* @param iterate Starting point (will be modified).
* @return Objective value of the final point.
*/
double Optimize(arma::mat& iterate);
double Optimize(DecomposableFunctionType& function, arma::mat& iterate);
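For context, a minimal sketch of how the two overloads are meant to be used; FunctionType here is any hypothetical type meeting the DecomposableFunctionType requirements (Evaluate(), Gradient(), NumFunctions()), and the details are illustrative rather than the exact merged code:

#include <mlpack/core.hpp>
#include <mlpack/core/optimizers/sgd/sgd.hpp>

using namespace mlpack::optimization;

// A sketch comparing the old and new calling conventions.
template<typename FunctionType>
void CompareApis(FunctionType& f)
{
  arma::mat coordinates = f.GetInitialPoint();

  // The optimizer can still be constructed with the function, since
  // this PR keeps the old API for backwards compatibility.
  SGD<FunctionType> opt(f);

  // Old API: optimize the function bound at construction time.
  double oldResult = opt.Optimize(coordinates);

  // New API added by this PR: pass the function to Optimize() directly,
  // so the same optimizer instance can be reused on other objectives.
  double newResult = opt.Optimize(f, coordinates);
}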

@micyril (Member) commented Jun 13, 2017

From my side (developing the hyper-parameter tuning module), it's not enough to take the function as a parameter in Optimize(); the function type should be a template parameter of the method.

@ShangtongZhang (Member) commented Jun 13, 2017

Yeah, that's true. We can do that when we refactor all the optimizers systematically, but for this PR it's enough.
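A rough sketch of the refactoring @micyril is asking for (illustrative, not the merged code): the function type becomes a template parameter of Optimize() itself rather than of the optimizer class, which is what the hyper-parameter tuning module needs.

class SGD
{
 public:
  // The function type is deduced per call, so a single SGD instance
  // can optimize objectives of different types.
  template<typename FunctionType>
  double Optimize(FunctionType& function, arma::mat& iterate);
};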

@zoq (Member) commented Jun 13, 2017

@ShangtongZhang, can you take a look at the Windows build issue (AppVeyor)? I think that if you specialize Optimize(FunctionType& function, arma::mat& parameters); it will work as well: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/regularized_svd/regularized_svd_function.hpp#L113
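The linked file declares an explicit specialization of SGD's old Optimize() overload for RegularizedSVDFunction; the suggestion is to add a matching declaration for the new overload. A sketch of what that declaration might look like (the exact signature is an assumption, not the committed fix):

namespace mlpack {
namespace optimization {

// Hypothetical specialization of the new overload, mirroring the
// existing specialization of the old one, which zoq suggests would
// also fix the Windows (AppVeyor) build.
template<>
double SGD<svd::RegularizedSVDFunction>::Optimize(
    svd::RegularizedSVDFunction& function,
    arma::mat& parameters);

} // namespace optimization
} // namespace mlpack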

@rcurtin (Member) commented Jun 13, 2017

This looks good to me once the Windows issue is fixed. From my side, I will provide updated L-BFGS, AugLagrangian, SA, and SDP optimizers with the API changes that @micyril needs; I hope to complete that this week. Once I open that PR, @ShangtongZhang, will you be able to finish refactoring these to take FunctionType as a template parameter only to the Optimize() function?

@ShangtongZhang (Member) commented Jun 13, 2017

Sure. And if you delete the function parameter from the constructors of those optimizers, I will do the same for the optimizers in this PR to keep them consistent. I'll also look into the Windows build issue tonight.
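Putting the two planned changes together, the end state might look like this sketch (class and constructor parameters are illustrative, not the final mlpack API):

class SGD
{
 public:
  // The constructor takes hyper-parameters only; the function
  // parameter discussed above is gone.
  SGD(const double stepSize = 0.01, const size_t maxIterations = 100000);

  // The objective is supplied per call, with its type deduced here.
  template<typename FunctionType>
  double Optimize(FunctionType& function, arma::mat& iterate);
};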

@zoq (Member) commented Jun 14, 2017

I don't mind merging this PR now, since it's backwards compatible. You could open another PR once @rcurtin has modified the other optimizer classes; your decision.

@ShangtongZhang (Member) commented Jun 14, 2017

It would be great if you could merge this PR, @zoq; then I will update the DQN PR and get it merged. After @rcurtin opens his PR, I can open another PR to modify these optimizers and refactor FFN, RNN, DQN, and the other affected components; those two PRs can then be merged together.

@rcurtin (Member) commented Jun 14, 2017

Sure, merged. I am working on L-BFGS and the other optimizers now.

@rcurtin rcurtin merged commit f7df912 into mlpack:master Jun 14, 2017

3 checks passed:

Style Checks: Build finished.
continuous-integration/appveyor/pr: AppVeyor build succeeded.
continuous-integration/travis-ci/pr: The Travis CI build passed.

@ShangtongZhang referenced this pull request on Jun 14, 2017:

Basic DQN #1014 (merged)
