Tensorflow Simplified: Linear and Sigmoid Layers, Forward and Back Prop, Stochastic Gradient Descent
Updated Aug 1, 2017 - Python
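As a companion to the topic title above, here is a minimal sketch of a linear layer with a sigmoid activation, a forward pass, a backward pass, and one stochastic-gradient-descent step. It uses NumPy rather than TensorFlow, and all names and data are illustrative, not taken from any of the listed repositories:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(a):
    # derivative expressed via the activation: s'(z) = s(z) * (1 - s(z))
    return a * (1.0 - a)

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3)) * 0.1   # linear layer: 3 inputs -> 2 outputs
b = np.zeros(2)

x = np.array([0.5, -1.0, 2.0])      # one training example (hence "stochastic")
y = np.array([1.0, 0.0])            # target

# forward pass
z = W @ x + b
a = sigmoid(z)
loss_before = 0.5 * np.sum((a - y) ** 2)

# backward pass for squared error L = 0.5 * ||a - y||^2
delta = (a - y) * sigmoid_grad(a)   # dL/dz via the chain rule
dW, db = np.outer(delta, x), delta  # dL/dW and dL/db

# SGD update: step against the gradient
lr = 0.1
W -= lr * dW
b -= lr * db
```

One such update on a single example is the whole of stochastic gradient descent; full training just repeats it over shuffled examples.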
Three-layer perceptron. Generates varied character images from scratch, so it does not depend on an existing dataset such as MNIST, and can train for an arbitrarily long time thanks to the open-ended nature of the data. Includes a basic user interface.
A comprehensive approach to recognizing emotion (sentiment) in a given tweet, using supervised machine learning.
A simple multi-layer perceptron application using the feedforward backpropagation algorithm.
• Trained the network on the MNIST dataset • Implemented a neural network on MNIST using Sigmoid, ReLU, and ELU as the activation functions • Analyzed the network's running time, error rate, efficiency, and accuracy.
A classifier to differentiate between Cat and Non-Cat Images
Linear regression model with gradient-descent learning and various basis functions such as sigmoidal, Gaussian, and p-norms, plus L2 regularization. GOAL: to predict the number of likes on Facebook posts.
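The basis-function regression with L2 regularization described above can be sketched as follows. This is an illustrative toy (Gaussian basis on synthetic data; a sigmoidal basis works the same way), not the repository's actual code:

```python
import numpy as np

def gaussian_basis(x, centers, width=1.0):
    # one feature per center: phi_j(x) = exp(-(x - c_j)^2 / (2 * width^2))
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = np.sin(x) + 0.1 * rng.normal(size=50)   # toy target in place of like counts

centers = np.linspace(0, 10, 12)
Phi = gaussian_basis(x, centers)            # design matrix, shape (50, 12)

# L2-regularized (ridge) closed form: w = (Phi^T Phi + lam * I)^-1 Phi^T y
lam = 0.1
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(centers)), Phi.T @ y)
pred = Phi @ w
```

The closed form stands in for gradient descent here; both minimize the same regularized squared-error objective.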
Feed-forward neural network to classify Facebook post likes into classes of low, moderate, or high likes; backpropagation is implemented with a decaying learning rate.
Neural network with 2 hidden layers
A single-layer neural network, i.e. a perceptron, which learns the pattern in a given binary input.
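The classic single-layer perceptron rule behind a project like the one above can be sketched in a few lines. Learning the AND truth table is an illustrative choice of binary pattern, not necessarily the one the repository uses:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])          # AND truth table as the target pattern

w = np.zeros(2)
b = 0.0
lr = 0.5

for _ in range(20):                 # a few epochs suffice for separable data
    for x, target in zip(X, t):
        out = 1 if w @ x + b > 0 else 0
        # perceptron rule: nudge weights only on misclassified examples
        w += lr * (target - out) * x
        b += lr * (target - out)

preds = [1 if w @ x + b > 0 else 0 for x in X]
```

Because AND is linearly separable, the rule is guaranteed to converge; on a non-separable pattern such as XOR it would cycle forever, which is exactly why multi-layer networks appear elsewhere in this list.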
A neural network with functions for forward propagation, error calculation, and backpropagation, built from scratch and used to analyse the IRIS dataset.
A simple neural network with backpropagation, used to recognize ASCII-coded characters.
Second project of the 'Machine Learning' course of the SMARTNET programme, taken at the National and Kapodistrian University of Athens.
Implementing Artificial Neural Network training process in Python
MNIST data training and testing with backpropagation.
University assignment - neural network with sigmoid and ReLU activation functions (Python).
A simple, single-layer, neural network
🏆 A comparative study on handwritten digit recognition using classifiers such as K-Nearest Neighbours (K-NN), a Multiclass Perceptron/Artificial Neural Network (ANN), and a Support Vector Machine (SVM), discussing the pros and cons of each algorithm and comparing their accuracy and efficiency.