
SPSA optimizer implementation #69

Merged
8 commits merged on Jan 13, 2019

Conversation

@Rajiv2605 (Contributor) commented Jan 11, 2019

No description provided.

@Rajiv2605 Rajiv2605 referenced this pull request Jan 11, 2019
@Rajiv2605 (Contributor, Author) commented Jan 11, 2019

@zoq Is there anything more to add to the code? Please give your suggestions :)

@zoq (Member) commented Jan 11, 2019

I'm not sure I see the difference between #44 and this one, maybe I missed something?

@Rajiv2605 (Contributor, Author) commented Jan 11, 2019

Yeah, the code is the same :D It just comes from my fork so that I'll be able to keep contributing to it. I went through the code and I'm not clear on what additions I need to make. When the code was in mlpack, it wasn't passing the checks and wasn't compiling. So I compiled ensmallen with SPSA, and it compiled and passed the check. So I'm not sure what else to add to the code; it would be great if you could guide me here :)

@zoq (Member) commented Jan 11, 2019

I see. I think the code is fine; if I remember right, I adjusted some parameters. One thing left to do is to add the SPSA optimizer to the optimizer documentation. I'll take a look at the code later today and leave comments if I see anything else.

@Rajiv2605 (Contributor, Author) commented Jan 11, 2019

Oh, thanks! I'll put together some content to add to the documentation 👍

@zoq (Member) commented Jan 11, 2019

Sounds good; basically you can use the class documentation as a basis. See https://github.com/mlpack/ensmallen/blob/master/doc/optimizers.md for more information.

namespace ens {

/**
* Implementation of the SPSA methodT. The SPSA algorithm approximates the

@zoq (Member) commented Jan 11, 2019
Looks like we can remove the T here.

* year = {1998}
* }
* @endcode
*

@zoq (Member) commented Jan 11, 2019
Remove the extra line here and add something like:

* SGD can optimize differentiable separable functions. For more details, see
* the documentation on function types included with this distribution or on the
* ensmallen website.
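(Review context: the class comment above describes how SPSA approximates the gradient. The key property is that the estimate needs only two function evaluations per step, regardless of the problem's dimension. A minimal self-contained sketch of that estimate is below; it is illustrative only, not ensmallen's actual implementation, which uses Armadillo matrices rather than `std::vector`.)

```cpp
#include <cstddef>
#include <functional>
#include <random>
#include <vector>

// Sketch of the simultaneous-perturbation gradient estimate from Spall's
// paper.  f is a generic objective R^n -> R, x the current iterate, and
// c the evaluation step size c_k.  Illustrative only, not ensmallen code.
std::vector<double> SpsaGradient(
    const std::function<double(const std::vector<double>&)>& f,
    const std::vector<double>& x,
    const double c,
    std::mt19937& rng)
{
  std::bernoulli_distribution coin(0.5);
  const std::size_t n = x.size();

  // Random perturbation with +/- 1 entries (Rademacher distribution).
  std::vector<double> delta(n);
  for (std::size_t i = 0; i < n; ++i)
    delta[i] = coin(rng) ? 1.0 : -1.0;

  // Only two evaluations of f, independent of the dimension n.
  std::vector<double> xPlus(x), xMinus(x);
  for (std::size_t i = 0; i < n; ++i)
  {
    xPlus[i] += c * delta[i];
    xMinus[i] -= c * delta[i];
  }
  const double diff = f(xPlus) - f(xMinus);

  // gHat_i = (f(x + c * delta) - f(x - c * delta)) / (2 * c * delta_i).
  std::vector<double> gHat(n);
  for (std::size_t i = 0; i < n; ++i)
    gHat[i] = diff / (2.0 * c * delta[i]);
  return gHat;
}
```

Each single estimate is noisy, but for smooth objectives it is unbiased up to O(c^2), which is why one noisy estimate per iteration suffices inside the stochastic-approximation loop.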

* @param alpha Scaling exponent for the step size.
* @param batchSize Batch size to use for each step.
* @param gamma Scaling exponent for evaluation step size.
* @param a Scaling parameter for step size.

@zoq (Member) commented Jan 11, 2019

It might be useful to go with more descriptive parameter names for a/c, something like EvaluationStepSize and StepSize.

@Rajiv2605 (Author, Contributor) commented Jan 12, 2019

In the actual paper, the variables a and c are used. So I have renamed them in the code as you suggested, and also mentioned a and c in brackets to correlate with the paper. Is that fine?

@zoq (Member) commented Jan 12, 2019

Sounds good to me.
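(For anyone correlating the parameters with the paper: in Spall (1998), the gains decay as a_k = a / (A + k + 1)^alpha and c_k = c / (k + 1)^gamma, which is what alpha, gamma, a (StepSize), and c (EvaluationStepSize) control. A tiny sketch of those two sequences follows; the function names are illustrative, not ensmallen's actual members.)

```cpp
#include <cmath>
#include <cstddef>

// Gain sequences from Spall (1998); a is the step size (StepSize), c the
// evaluation step size (EvaluationStepSize), A a stability constant, and
// k the iteration index.  Names are illustrative, not ensmallen's API.
double StepSizeAt(const double a, const double A, const double alpha,
                  const std::size_t k)
{
  return a / std::pow(A + k + 1.0, alpha);
}

double EvaluationStepSizeAt(const double c, const double gamma,
                            const std::size_t k)
{
  return c / std::pow(k + 1.0, gamma);
}
```

Spall suggests alpha = 0.602 and gamma = 0.101 so that both sequences decay slowly in early iterations, with A often chosen as roughly 10% of the expected iteration count.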

const double testAcc = lr.ComputeAccuracy(testData, testResponses,
coordinates);
REQUIRE(testAcc == Approx(100.0).epsilon(0.006)); // 0.6% error tolerance.
}

@zoq (Member) commented Jan 11, 2019

Missing new line at the end of the file.

@Rajiv2605 (Contributor, Author) commented Jan 12, 2019
commented Jan 12, 2019

> Sounds good, basically you can use the class documentation as a basis, see https://github.com/mlpack/ensmallen/blob/master/doc/optimizers.md for more information.

I have added SPSA to the documentation, but the hyperlinks to the differentiable separable function and the arbitrary function are not working.

rajiv2605 added some commits Jan 12, 2019

@zoq (Member) commented Jan 12, 2019

Once the documentation is built, the links should work.

// Include implementation.
#include "spsa_impl.hpp"

#endif

@zoq (Member) commented Jan 12, 2019

Missing new line.


} // namespace ens

#endif

@zoq (Member) commented Jan 12, 2019

Missing new line.


#### See also:

* [SPSA on Wikipedia](https://en.wikipedia.org/wiki/Simultaneous_perturbation_stochastic_approximation)

@zoq (comment minimized)

@Rajiv2605 (Author, Contributor) commented Jan 13, 2019

Yeah it will be useful 👍

@zoq (Member) approved these changes Jan 13, 2019, leaving a comment:

Everything looks good to me, thanks!

@zoq zoq merged commit d044675 into mlpack:master Jan 13, 2019

1 check passed: continuous-integration/travis-ci/pr (The Travis CI build passed)
@Rajiv2605 (Contributor, Author) commented Jan 14, 2019

@zoq Can I be added to the list of contributors? 😅

@zoq (Member) commented Jan 14, 2019

You are already on the list: https://github.com/mlpack/ensmallen/blob/master/COPYRIGHT.txt#L29, but let me update the year.

@Rajiv2605 (Contributor, Author) commented Jan 14, 2019

Oh, sorry 😅 I was checking the list in the README.

@zoq (Member) commented Jan 14, 2019

Ahh, I see; I will fix that later today.

@zoq (Member) commented Jan 14, 2019

Added in c3211af, thanks again for the contribution.

@Rajiv2605 (Contributor, Author) commented Jan 14, 2019

Thanks! 😃
