Implementing Artificial Neural Network training process in Python
Updated Jun 8, 2020 · Python
Implementation of a neural network from scratch using only NumPy (Conv, FC, MaxPool, optimizers, and activation functions)
A simple neural network with backpropagation used to recognize ASCII coded characters
A multilayer neural net written in Go
Classification model using sigmoid activation with unknown class data
This is a Feed-Forward Neural Network with back-propagation written in C++ from scratch with no external libraries.
Neural Network from scratch without any machine learning libraries
A minimal, limited-feature deep learning library, created with the goal of better understanding the field.
This program implements logistic regression from scratch using the gradient descent algorithm in Python to predict whether customers will purchase a new car based on their age and salary.
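A minimal sketch of what such a logistic-regression program might look like. The feature values, labels, and hyperparameters below are hypothetical illustrations, not taken from the repository; the age/salary features are assumed to be standardized.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    # X: (n_samples, n_features), y: (n_samples,) of 0/1 labels
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)            # predicted purchase probability
        grad_w = X.T @ (p - y) / len(y)   # gradient of the cross-entropy loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w                  # gradient descent update
        b -= lr * grad_b
    return w, b

# Hypothetical standardized (age, salary) features and purchase labels
X = np.array([[-1.0, -1.2], [-0.5, -0.8], [0.6, 0.9], [1.2, 1.1]])
y = np.array([0, 0, 1, 1])
w, b = train_logistic_regression(X, y)
preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
```

On this toy, linearly separable data, gradient descent drives the weights toward a separating boundary, so the final predictions match the labels.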
🤖 Artificial intelligence (neural network) proof of concept solving the classic XOR problem. It applies standard neural-network techniques such as gradient descent, feed-forward computation, and back-propagation.
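The combination of feed-forward, back-propagation, and gradient descent on XOR can be sketched as follows. The network size, learning rate, epoch count, and use of a squared-error loss are assumptions for illustration, not details from the repository.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR truth table: output is 1 iff exactly one input is 1
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units (sizes chosen arbitrarily)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

lr = 1.0
for _ in range(10000):
    # Feed forward
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Back-propagate the squared-error gradient through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```

A single linear unit cannot separate XOR; the hidden layer is what makes the problem solvable, which is why this is the classic test case.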
Image Compression using one hidden layer Neural Network
A deep learning framework built on NumPy, written to study how everything works under the hood.
Deep Forward Architecture From Scratch
Applying a neural network with the Adam optimizer to the heart failure clinical records dataset to compare the test errors of the sigmoid, tanh, and ReLU activation functions
A small web app for visualizing activation functions
Deep-Learning neural network to analyze and classify the success of charitable donations.
Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). You can also choose among different loss functions: cross-entropy loss, hinge loss, and mean squared error (MSE).
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
NU Bootcamp Module 21