
Implement SGD under Algorithms/GradientOptimizers.hpp #2

Open
JackHunt opened this issue Dec 6, 2022 · 0 comments
JackHunt commented Dec 6, 2022

Is your feature request related to a problem? Please describe.
Currently there is only a base optimiser class; there should be at least one concrete optimiser.

Describe the solution you'd like
An implementation (CPU-only as a start) of simple SGD (no momentum, Nesterov, etc.) following the API of GradientOptimizerBase. Simple test cases checking convergence, along with documentation, should also be present.

Describe alternatives you've considered
Removing GradientOptimizerBase until a concrete optimiser is required, but this seems wasteful.

Additional context
The optimiser should not contain f and df implementations; it should implement only the optimisation logic. The mathematical functions should be user provided.

@JackHunt JackHunt added the enhancement New feature or request label Dec 6, 2022
@JackHunt JackHunt self-assigned this Dec 6, 2022