Update Optimizer API #1032

Merged
19 commits merged into mlpack:master on Jun 27, 2017

Conversation

3 participants
@ShangtongZhang
Member

ShangtongZhang commented Jun 18, 2017

Finally made it compile successfully... let's see if any tests fail unexpectedly.

@ShangtongZhang

Member

ShangtongZhang commented Jun 19, 2017

Should we delete SoftmaxRegressionTrainTest and SoftmaxRegressionOptimizerTrainTest? The initialization uses randn, so we cannot expect them to produce the same results as before. (It worked previously because we could specify the function to optimize at that time.)

@rcurtin

This looks great, thank you for taking the time to make these changes. I think it's a big improvement, and it helps us ensure that our abstractions make the code easier to work with and maintain (even though it's a lot of work to change those abstractions when we need to...).

The two failing tests for softmax regression can be fixed; I think the fixes should be pretty easy. I pasted some code you can use in the comments.

Some of the comments I make apply to multiple places in the code, but I only made the comment once, so if you like, I can go back through and point out all instances. I wanted to avoid reviewing with 10000 comments though. :)

// If a user changed the batch size, we have to update the restart fraction
// of the cyclical decay instantiation.
if (optimizer.BatchSize() != batchSize)
{
  batchSize = optimizer.BatchSize();
  optimizer.DecayPolicy().EpochBatches() = function.NumFunctions() /
      double(batchSize);
}

@rcurtin

rcurtin Jun 20, 2017

Member

I'm not sure I follow the change here, can you explain why this was needed? I think it is probably a simple reason I am overlooking.

@ShangtongZhang

ShangtongZhang Jun 21, 2017

Member

See the change in src/mlpack/core/optimizers/sgdr/cyclical_decay.hpp. Previously we computed epochBatches in the constructor, but now we can't do that because we no longer pass the function to the constructor. So I just recompute this value every time before calling Optimize().

@@ -97,29 +91,12 @@ class SMORMS3
* @param iterate Starting point (will be modified).
* @return Objective value of the final point.
*/
template<typename DecomposableFunctionType>

@rcurtin

rcurtin Jun 20, 2017

Member

I know this is a bit tedious, but can you add the @tparam DecomposableFunctionType ... documentation to the Optimize() functions please since it's now removed from the class documentation?
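For reference, the requested per-function documentation might look something like this. This is only a sketch; the wording and the listed requirements are illustrative, not the text that was actually committed:

```cpp
/**
 * Optimize the given function using SMORMS3.
 *
 * @tparam DecomposableFunctionType Type of the function to be optimized;
 *     it should provide Evaluate(), Gradient(), and NumFunctions() methods.
 * @param function Function to optimize.
 * @param iterate Starting point (will be modified).
 * @return Objective value of the final point.
 */
template<typename DecomposableFunctionType>
double Optimize(DecomposableFunctionType& function, arma::mat& iterate);
```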

void Train(arma::mat predictors,
arma::mat responses,
OptimizerType<NetworkType, OptimizerTypeArgs...>& optimizer);
OptimizerType& optimizer);

@rcurtin

rcurtin Jun 20, 2017

Member

I think this is a nice improvement, this code is clearer now. :)

Outdated review threads (collapsed) on:
src/mlpack/methods/regularized_svd/regularized_svd_function.cpp
src/mlpack/methods/softmax_regression/softmax_regression.hpp (2)
src/mlpack/methods/sparse_autoencoder/sparse_autoencoder.hpp
src/mlpack/tests/sgdr_test.cpp
src/mlpack/tests/softmax_regression_test.cpp (2)
@ShangtongZhang

Member

ShangtongZhang commented Jun 22, 2017

Hope it is ready to merge now.

@zoq

zoq approved these changes Jun 23, 2017

arma::mat& accept,
arma::mat& moveSize,

@zoq

zoq Jun 23, 2017

Member

Do you mind adding a comment about moveSize? Don't feel obligated; we can do this later.

@rcurtin

Looks good to me. This one has already been approved for a few days, so I'll let it sit for two more days before merging in case anyone has any other comments.

@rcurtin rcurtin merged commit 97fd47d into mlpack:master Jun 27, 2017

3 checks passed:
Style Checks: Build finished.
continuous-integration/appveyor/pr: AppVeyor build succeeded
continuous-integration/travis-ci/pr: The Travis CI build passed
@rcurtin

Member

rcurtin commented Jun 27, 2017

Thanks! I am happy with this change, I think it is a good improvement. :)

@rcurtin rcurtin referenced this pull request Jul 2, 2017

Merged

Classic FrankWolfe and OMP #1041
