NNlibPy

A small Neural Network library for testing different gradient descent optimization algorithms.

Project

This library was written to study different first-order gradient descent algorithms for the CS 213 Optimization course.

The aim of the project is to study some of the first-order optimization algorithms used in neural networks and to compare them in terms of accuracy, loss, and convergence time on a simple feed-forward neural network. In particular, the project studies the Nesterov Accelerated Gradient, AdaGrad, RMSprop, and Adam optimization algorithms.
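
For reference, here is a minimal NumPy sketch of the Adam update rule, the most involved of the four algorithms. The function and parameter names are illustrative only and are not this library's API:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters w given the gradient grad.

    m and v are running estimates of the first and second moments of the
    gradient; t is the 1-based step counter used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moment estimates
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps) # parameter update
    return w, m, v
```

RMSprop keeps only the second-moment average (no bias correction), and AdaGrad accumulates squared gradients without the exponential decay, so the other adaptive methods follow the same overall shape.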

Dependencies

To run the code you need Python 3 and the following Python packages installed:

  • NumPy
  • Matplotlib
  • scikit-learn
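
A quick way to check that the dependencies are importable in your environment (import names only; no particular versions are assumed here):

```python
# Verify that the required packages can be imported.
import numpy
import matplotlib
import sklearn

print("numpy", numpy.__version__)
print("matplotlib", matplotlib.__version__)
print("scikit-learn", sklearn.__version__)
```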

FAQ

Will the project be continued?

Yes, I intend to continue this project by testing other algorithms and methods used in machine learning, adding new features, and fixing and improving the existing algorithms.

Do I need help?

Yes! Help, suggestions, advice, and constructive criticism are all very welcome.

How to contribute?

Just clone the project, add the features you think are helpful, then contact me so that we can merge them into the existing project. For advice, suggestions, or bug reports, you can open an issue in the Issues section.