Simple multi-layer perceptron application using the feed-forward backpropagation algorithm
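As a rough illustration of the named algorithm (a generic sketch, not this repository's code), a one-hidden-layer MLP trained with backpropagation on XOR can be written in a few lines of NumPy:

```python
# Generic one-hidden-layer MLP trained with backpropagation on XOR
# (an illustrative sketch; not taken from the repository itself).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.standard_normal((2, 4)), np.zeros(4)
W2, b2 = rng.standard_normal((4, 1)), np.zeros(1)
lr = 0.5  # illustrative learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)          # hidden layer, tanh activation
    out = sigmoid(h @ W2 + b2)        # output layer, sigmoid activation
    # Backward pass (gradients of a squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    # Gradient-descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(3).ravel())           # should approach [0, 1, 1, 0]
```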
Developed neural networks (NNs) with one, two, and four hidden layers, in addition to the input and output layers. Tested with sigmoid, tanh, and ReLU activation functions. Used scikit-learn for data pre-processing.
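A minimal sketch of this kind of depth/activation comparison using scikit-learn's MLPClassifier (an assumption for illustration; the repository implements its own network and uses scikit-learn only for pre-processing, and the dataset and hyperparameters here are invented):

```python
# Sketch: compare 1, 2, and 4 hidden layers across sigmoid, tanh, and ReLU.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X = StandardScaler().fit_transform(X)          # pre-processing step
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for layers in [(16,), (16, 16), (16, 16, 16, 16)]:   # 1, 2, and 4 hidden layers
    for act in ["logistic", "tanh", "relu"]:          # sigmoid, tanh, ReLU
        clf = MLPClassifier(hidden_layer_sizes=layers, activation=act,
                            max_iter=500, random_state=0).fit(X_tr, y_tr)
        print(layers, act, clf.score(X_te, y_te))
```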
GAAF implementation in Keras
Feed-forward neural network to classify Facebook post likes into low, moderate, or high classes; backpropagation is implemented with a learning-rate decay method
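One common form of learning-rate decay is time-based decay, sketched below (an assumed schedule; the repository does not state which decay rule it uses):

```python
# Sketch of a time-based learning-rate decay schedule (assumed form).
initial_lr = 0.1   # illustrative values, not the repo's
decay = 0.01

def decayed_lr(epoch):
    """Learning rate shrinks as training progresses."""
    return initial_lr / (1.0 + decay * epoch)

# Inside a training loop the weight update would then be:
#   w -= decayed_lr(epoch) * gradient
for epoch in [0, 10, 100]:
    print(epoch, decayed_lr(epoch))
```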
Neural network with 2 hidden layers
Neural Network from scratch without any machine learning libraries
Second project of the 'Machine Learning' course of the SMARTNET programme, taken at the National and Kapodistrian University of Athens.
A neural network (NN) with two hidden layers, besides the input and output layers, is implemented. The code lets the user choose sigmoid, tanh, or ReLU as the activation function. Prediction accuracy is computed at the end.
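The three activation choices and their derivatives, as needed for backpropagation, are standard (a generic sketch; the repository's own function names are not known):

```python
# Generic definitions of the three activations and their derivatives.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh(x):
    return np.tanh(x)

def tanh_prime(x):
    return 1.0 - np.tanh(x) ** 2

def relu(x):
    return np.maximum(0.0, x)

def relu_prime(x):
    return (x > 0).astype(x.dtype)
```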
Multi-Layer Neural Network
Generic L-layer 'straight in Python' fully connected neural network implementation using NumPy.
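A minimal sketch of what a generic L-layer forward pass in NumPy can look like (layer sizes, initialization, and activation choices are assumptions; the repository's actual code may differ):

```python
import numpy as np

def init_params(layer_sizes, rng=np.random.default_rng(0)):
    """One (W, b) pair per layer; layer_sizes e.g. [784, 64, 32, 10]."""
    return [(rng.standard_normal((n_out, n_in)) * 0.01, np.zeros((n_out, 1)))
            for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(X, params):
    """ReLU hidden layers, linear output; X has shape (features, batch)."""
    A = X
    for W, b in params[:-1]:
        A = np.maximum(0.0, W @ A + b)      # hidden layer: affine + ReLU
    W, b = params[-1]
    return W @ A + b                        # output layer kept linear here

params = init_params([784, 64, 32, 10])
print(forward(np.zeros((784, 5)), params).shape)   # (10, 5)
```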
Comparison of common activation functions on MNIST dataset using PyTorch.
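A sketch of how such a comparison might be set up in PyTorch, swapping only the activation module between otherwise identical models (an assumed setup; the repository's architecture and training loop are not shown here):

```python
# Sketch: the same MLP with the activation module swapped out.
import torch
import torch.nn as nn

def make_mlp(act: nn.Module) -> nn.Sequential:
    return nn.Sequential(
        nn.Flatten(),                 # 28x28 MNIST image -> 784 vector
        nn.Linear(784, 128), act,
        nn.Linear(128, 10),
    )

for name, act in [("sigmoid", nn.Sigmoid()), ("tanh", nn.Tanh()),
                  ("relu", nn.ReLU())]:
    model = make_mlp(act)
    x = torch.zeros(1, 1, 28, 28)     # dummy MNIST-shaped input
    print(name, model(x).shape)       # each variant maps to 10 logits
```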
Data classification using an MLP
Deep Learning
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "