Various Optimizers based on Gradient Descent

  • Final update: 2018. 12. 15.
  • All rights reserved © Il Gu Yi 2018

Educational Purpose

  • Implementation of various optimization algorithms based on gradient descent (a minimal sketch of the basic update rule follows this list)
  • Uses only NumPy; no deep learning frameworks such as TensorFlow
  • Low-level implementation of each algorithm
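
For reference, here is a minimal NumPy-only sketch of the vanilla gradient descent update that the other optimizers build on. The function name, learning rate, and toy objective are illustrative assumptions, not taken from the notebooks.

```python
import numpy as np

def gradient_descent(grad_fn, w0, learning_rate=0.1, num_steps=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    w = np.asarray(w0, dtype=float)
    for _ in range(num_steps):
        w = w - learning_rate * grad_fn(w)  # update rule: w <- w - lr * grad
    return w

# Toy objective f(w) = ||w||^2, whose gradient is 2w; the minimum is at the origin.
w_opt = gradient_descent(lambda w: 2.0 * w, w0=[3.0, -4.0])
print(w_opt)  # converges toward [0, 0]
```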

Getting Started

Prerequisites

  • Python 3.6
    • numpy, matplotlib
  • Jupyter notebook
  • macOS and Linux (not validated on Windows, but it will probably work)

Contents

Linear Regression using Gradient Descent
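
As a quick illustration of what this notebook covers, below is a minimal NumPy sketch of fitting a line by gradient descent on synthetic data. The data, learning rate, and iteration count are illustrative assumptions, not the notebook's actual values.

```python
import numpy as np

np.random.seed(0)
x = np.random.rand(100)
y = 3.0 * x + 1.0 + 0.1 * np.random.randn(100)   # synthetic data: y = 3x + 1 plus noise

w, b = 0.0, 0.0
lr = 0.5
for _ in range(1000):
    y_hat = w * x + b
    grad_w = np.mean(2.0 * (y_hat - y) * x)  # d(MSE)/dw
    grad_b = np.mean(2.0 * (y_hat - y))      # d(MSE)/db
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # approximately 3.0 and 1.0
```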

Optimization of Beale Function using Various Gradient Descent Algorithms
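
For orientation, here is a minimal sketch of the Beale function, its analytic gradient, and an Adam-style update loop as one example of the gradient descent variants compared here. The hyperparameters and starting point are illustrative guesses, not values from the notebook.

```python
import numpy as np

def beale(x, y):
    """Beale function; global minimum at (3, 0.5)."""
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def beale_grad(x, y):
    """Analytic gradient of the Beale function."""
    dx = (2 * (1.5 - x + x * y) * (y - 1)
          + 2 * (2.25 - x + x * y ** 2) * (y ** 2 - 1)
          + 2 * (2.625 - x + x * y ** 3) * (y ** 3 - 1))
    dy = (2 * (1.5 - x + x * y) * x
          + 2 * (2.25 - x + x * y ** 2) * 2 * x * y
          + 2 * (2.625 - x + x * y ** 3) * 3 * x * y ** 2)
    return np.array([dx, dy])

p = np.array([1.0, 1.5])                 # illustrative starting point
m, v = np.zeros(2), np.zeros(2)          # Adam first/second moment estimates
lr, beta1, beta2, eps = 0.01, 0.9, 0.999, 1e-8
for t in range(1, 10001):
    g = beale_grad(*p)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)         # bias correction
    v_hat = v / (1 - beta2 ** t)
    p = p - lr * m_hat / (np.sqrt(v_hat) + eps)

print(p, beale(*p))  # should move toward the global minimum near (3, 0.5)
```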

Results

Optimization of Linear Regression using Various Gradient Descent Algorithms

Animation: regression_all.gif

Optimization of Beale Function using Various Gradient Descent Algorithms

Animation: all_test_optimizers.gif

References

Author

Il Gu Yi