Generic L-layer 'straight in Python' fully connected Neural Network implementation using numpy.
Updated May 29, 2021 · Python
Feed-forward neural network to classify Facebook post likes into low, moderate, or high classes; backpropagation is implemented with a decaying learning rate.
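As a point of reference for the "decaying learning rate" mentioned above, a common time-based schedule looks like the sketch below. The function name and the values of `lr0` and `decay` are illustrative assumptions, not code from the repository.

```python
# Hypothetical sketch of time-based learning-rate decay:
# lr_t = lr0 / (1 + decay * t). The constants here are example values.
def decayed_lr(lr0, decay, step):
    """Return the decayed learning rate at a given training step."""
    return lr0 / (1.0 + decay * step)

# The rate starts at lr0 and shrinks smoothly as training progresses.
schedule = [decayed_lr(0.1, 0.01, t) for t in (0, 100, 200)]
```

With `decay = 0.01`, the rate halves by step 100, which is why this schedule is often paired with plain gradient descent to stabilize late training.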
GAAF implementation on Keras
2nd Project of Course 'Machine Learning' of the SMARTNET programme. Taken at the National and Kapodistrian University of Athens.
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
Data classification using an MLP
Comparison of common activation functions on MNIST dataset using PyTorch.
A neural network (NN) with two hidden layers is implemented, besides the input and output layers. The code lets the user choose sigmoid, tanh, or ReLU as the activation function. Prediction accuracy is computed at the end.
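The three activation choices named above can be written in a few lines of numpy; this is a generic sketch of the standard definitions, not code taken from the repository.

```python
import numpy as np

# Standard definitions of the three activation functions mentioned above.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # squashes to (0, 1)

def tanh(x):
    return np.tanh(x)                 # squashes to (-1, 1), zero-centered

def relu(x):
    return np.maximum(0.0, x)         # passes positives, zeroes negatives
```

Being zero-centered is tanh's usual selling point over sigmoid: hidden activations average near zero, which tends to make gradient descent better conditioned.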
Neural network with 2 hidden layers
Multi-Layer Neural Network
Deep Learning
Neural Network from scratch without any machine learning libraries
Developed neural networks (NN) with one, two, and four hidden layers, besides the input and output layers. Tested with sigmoid, tanh, and ReLU activation functions. Used scikit-learn for preprocessing the data.
Simple multi-layer perceptron application using the feed-forward backpropagation algorithm
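A minimal sketch of one feed-forward/backpropagation step for a one-hidden-layer tanh MLP, illustrating the algorithm several of the repositories above implement. The shapes, the squared-error loss, and the learning rate are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))          # 4 samples, 3 features (illustrative)
y = rng.normal(size=(4, 1))          # regression targets (illustrative)
W1 = rng.normal(size=(3, 5)) * 0.1   # input -> hidden weights
W2 = rng.normal(size=(5, 1)) * 0.1   # hidden -> output weights
lr = 0.1                             # assumed learning rate

# Forward pass
h = np.tanh(X @ W1)                  # hidden activations
y_hat = h @ W2                       # linear output layer
loss = 0.5 * np.mean((y_hat - y) ** 2)

# Backward pass: gradients of the mean squared error
d_out = (y_hat - y) / len(X)
dW2 = h.T @ d_out
d_h = (d_out @ W2.T) * (1.0 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
dW1 = X.T @ d_h

# Gradient-descent update
W2 -= lr * dW2
W1 -= lr * dW1
```

The identity `tanh'(z) = 1 - tanh(z)^2` is what makes tanh convenient here: the derivative is computed from the already-stored forward activations, with no extra evaluation.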