ilguyi/optimizers.numpy

Various Optimizers based on Gradient Descent

  • Final update: 2018. 12. 15.
  • All rights reserved © Il Gu Yi 2018

Educational Purpose

  • Implementations of various optimization algorithms based on gradient descent
  • Uses only NumPy; no deep learning frameworks such as TensorFlow
  • Low-level, from-scratch code for each algorithm
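As a taste of the from-scratch style, here is a minimal sketch of vanilla gradient descent in pure NumPy. The quadratic test function and all names below are illustrative, not taken from the repository:

```python
import numpy as np

# Illustrative test function f(x, y) = x^2 + 10*y^2, minimized at (0, 0).
def grad(p):
    x, y = p
    return np.array([2.0 * x, 20.0 * y])

p = np.array([3.0, 2.0])   # starting point
lr = 0.05                  # learning rate
for _ in range(500):
    p = p - lr * grad(p)   # gradient descent update: p <- p - lr * df/dp

print(p)  # converges to the minimum at (0, 0)
```

Every optimizer in this style is a variation on that single update line, differing only in how the step is scaled or accumulated.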

Getting Started

Prerequisites

  • Python 3.6
    • numpy, matplotlib
  • Jupyter notebook
  • OS X and Linux (not tested on Windows, but it will likely work)

Contents

Linear Regression using Gradient Descent

Optimization of Beale Function using Various Gradient Descent Algorithms

Results

Optimization of Linear Regression using Various Gradient Descent Algorithms

(Animation: regression_all.gif)
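The regression animation can be reproduced in spirit with a few lines of NumPy. The sketch below (illustrative names and synthetic data, not the repository's code) fits y = w·x + b by batch gradient descent on the mean squared error:

```python
import numpy as np

# Synthetic 1-D regression data (illustrative parameters).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
true_w, true_b = 2.0, -0.5
y = true_w * x + true_b + 0.01 * rng.standard_normal(100)

w, b = 0.0, 0.0
lr = 0.5
for _ in range(1000):
    err = (w * x + b) - y
    # Gradients of the loss 0.5 * mean(err^2)
    grad_w = np.mean(err * x)
    grad_b = np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # close to true_w = 2.0, true_b = -0.5
```

Swapping the two `-=` lines for momentum, RMSProp, or Adam updates gives the other trajectories shown in the animation.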

Optimization of Beale Function using Various Gradient Descent Algorithms

(Animation: all_test_optimizers.gif)
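For reference, the Beale function and one representative optimizer, Adam, can be sketched in a few lines of NumPy. The hyperparameters and the starting point (1.0, 1.0) are illustrative choices, not the repository's settings:

```python
import numpy as np

def beale(p):
    # Beale function: global minimum f(3, 0.5) = 0
    x, y = p
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def beale_grad(p):
    x, y = p
    t1 = 1.5 - x + x * y
    t2 = 2.25 - x + x * y ** 2
    t3 = 2.625 - x + x * y ** 3
    dx = 2 * t1 * (y - 1) + 2 * t2 * (y ** 2 - 1) + 2 * t3 * (y ** 3 - 1)
    dy = 2 * t1 * x + 2 * t2 * 2 * x * y + 2 * t3 * 3 * x * y ** 2
    return np.array([dx, dy])

# Adam with textbook default hyperparameters
p = np.array([1.0, 1.0])           # illustrative starting point
lr, b1, b2, eps = 0.01, 0.9, 0.999, 1e-8
m = np.zeros_like(p)               # first-moment (mean) estimate
v = np.zeros_like(p)               # second-moment (uncentered variance) estimate
for t in range(1, 10001):
    g = beale_grad(p)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)      # bias correction
    v_hat = v / (1 - b2 ** t)
    p = p - lr * m_hat / (np.sqrt(v_hat) + eps)

print(p, beale(p))  # should approach the minimum at (3, 0.5)
```

The same loop structure, with `m`/`v` replaced or dropped, yields the other optimizers compared in the animation.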

References

Author

Il Gu Yi
