Implementation of a simple neural network with one hidden layer and training with backpropagation
Updated Jan 14, 2016 · Python
Deep Learning exercises provided by DataCamp
Assignments for the Machine Learning course (COL774) at IITD
Backpropagation algorithm implemented in Python 3
NumPy neural networks with a Keras-like interface
For Azimuth ACT course
Implementation of logistic regression as a single neural network node
Testing various examples and code for Machine Learning using TensorFlow
This Jupyter Notebook contains a standalone Neural Network class.
A trainable convolutional neural network inside a fragment shader
PyTorch and from-scratch (no PyTorch) implementations for the MNIST handwriting dataset
All the assignments of DLFA course IIT KGP
Multivariate Classification Using a Feed-Forward Neural Network and Backpropagation.
This gradient descent library has not only a determination function but also "emotions" as an add-on (currently anger, surprise, and excitement); by emotions, we just mean built-in functions for efficient training.
A Selection of Artificial Sudoku Solvers
A multi-layer, feedforward neural network (FNN) implementation in Java
Creating a basic neural network from scratch with NumPy to recognize digits from the MNIST dataset
Visualization Tool For Keras
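Several of the repositories above implement the same core idea: a one-hidden-layer network trained with backpropagation. A minimal NumPy sketch of that setup is below; the XOR toy task, layer sizes, and learning rate are illustrative assumptions, not code taken from any listed repository.

```python
import numpy as np

# Minimal sketch of a one-hidden-layer network trained with backpropagation.
# The XOR task, layer sizes, and learning rate are illustrative choices.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8))  # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))  # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)  # network output
    if step % 1000 == 0:
        losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule through sigmoid and mean-squared error
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Gradient-descent parameter updates
    W2 -= lr * (h.T @ d_out) / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h) / len(X)
    b1 -= lr * d_h.mean(axis=0)

print("loss went from %.3f to %.3f" % (losses[0], losses[-1]))
```

The backward pass is just the chain rule applied layer by layer: the output error is scaled by the sigmoid derivative, then propagated through `W2` to obtain the hidden-layer error before each weight matrix is updated.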