Add BaseOptimizer, RMSProp, Adagrad, Adamax, Adadelta #23

Merged
merged 10 commits into OptimalFoundation:main on Mar 4, 2023

Conversation

bhavnicksm (Contributor)

This pull request does the following (a rough interface sketch follows the list):

  • Adds BaseOptimizer class
  • Adds SGD and Adam based on the Base Optimizer class
  • Adds Adagrad
  • Adds RMSProp
  • Adds Adadelta
  • Adds Adamax
  • Adds Momentum

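For orientation, here is a minimal sketch of how a BaseOptimizer with concrete subclasses might be structured. This is an illustrative assumption, not the code merged in this PR: the class names mirror the list above, but the constructor arguments, the `update()` signature, and the NumPy-based interface are hypothetical.

```python
# Hypothetical sketch -- not the library's actual API. Assumes a minimal
# NumPy-based interface where every optimizer subclasses BaseOptimizer
# and overrides a single update() method.
import numpy as np


class BaseOptimizer:
    """Common interface shared by SGD, Adam, RMSProp, Adagrad, Adamax, Adadelta."""

    def __init__(self, lr=0.01):
        self.lr = lr

    def update(self, params, grads):
        """Return updated parameters; subclasses implement the actual rule."""
        raise NotImplementedError


class SGD(BaseOptimizer):
    """Plain stochastic gradient descent built on the base class."""

    def update(self, params, grads):
        return params - self.lr * grads


class RMSProp(BaseOptimizer):
    """RMSProp: scale the step by a running average of squared gradients."""

    def __init__(self, lr=0.01, beta=0.9, eps=1e-8):
        super().__init__(lr)
        self.beta = beta
        self.eps = eps
        self.sq_avg = None  # running average of squared gradients

    def update(self, params, grads):
        if self.sq_avg is None:
            self.sq_avg = np.zeros_like(grads)
        self.sq_avg = self.beta * self.sq_avg + (1.0 - self.beta) * grads ** 2
        return params - self.lr * grads / (np.sqrt(self.sq_avg) + self.eps)
```

In a layout like this, the update rule is the only thing each subclass overrides, which is what would let Adagrad, Adamax, Adadelta, and Momentum be added as small, independent classes on top of the same base.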
@bhavnicksm bhavnicksm added the enhancement New feature or request label Mar 4, 2023
@bhavnicksm bhavnicksm added this to the v0.0.1 milestone Mar 4, 2023
@bhavnicksm bhavnicksm merged commit 3a9719c into OptimalFoundation:main Mar 4, 2023