Implementation of Artificial Neural Network

Implementation of an Artificial Neural Network with Bionodal Root Unit (BRU) activation functions in Python.

In the learning phase I use batch gradient descent with momentum; the momentum is controlled by the beta hyperparameter. For regularization I use L2 in every layer, controlled by the reg_lambda hyperparameter. The parameter "r" controls the non-linearity of the activation functions; for further details about this family of functions, see this repository: https://github.com/marek-kan/Bionodal-root-units
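
For reference, here is a minimal sketch of the kind of update step described above: batch gradient descent with momentum plus an L2 penalty on the weights. The variable names (W, dW, velocity) and the exact form of the momentum average are illustrative assumptions, not the code from bru_regressor.py.

```python
import numpy as np

def momentum_step(W, dW, velocity, lr=0.01, beta=0.9, reg_lambda=0.01, m=1):
    """One batch gradient-descent update with momentum and L2 regularization.

    W        -- weight matrix of a layer
    dW       -- gradient of the loss w.r.t. W for the current batch
    velocity -- running momentum term (same shape as W)
    m        -- number of samples in the batch (scales the L2 term)
    """
    grad = dW + (reg_lambda / m) * W                  # add the L2 penalty gradient
    velocity = beta * velocity + (1 - beta) * grad    # momentum-averaged gradient
    W = W - lr * velocity                             # descend along the velocity
    return W, velocity

# Tiny usage example with a 2x3 weight matrix
W = np.zeros((2, 3))
v = np.zeros_like(W)
dW = np.ones_like(W)
W, v = momentum_step(W, dW, v)
```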

  • test.py - runs a simple test on the scikit-learn Boston housing dataset. This NN beats my Linear Regression model by roughly 1 MAE (~2.2 vs ~3.3).
  • bru_regressor.py - contains the model definition (a minimal usage sketch follows below).
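
A minimal usage sketch, assuming the model class in bru_regressor.py follows a scikit-learn-style fit/predict interface and takes the hyperparameters described above; the actual class name, constructor arguments, and dataset loader (load_boston is removed in recent scikit-learn releases) may differ, so check bru_regressor.py and test.py for the real API.

```python
# Hypothetical usage; the class name BruRegressor and its signature are assumptions.
from sklearn.datasets import load_boston              # removed in scikit-learn >= 1.2
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

from bru_regressor import BruRegressor                # assumed class name

X, y = load_boston(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = BruRegressor(beta=0.9, reg_lambda=0.01, r=2)   # hyperparameter names from this README
model.fit(X_train, y_train)
pred = model.predict(X_test)
print("MAE:", mean_absolute_error(y_test, pred))
```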
