Uses vanilla backpropagation to train a basic multi-layer network to classify digits
Updated Nov 14, 2018 - MATLAB
A machine learning project that uses EM and Bernoulli mixtures to classify digits
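The EM-with-Bernoulli-mixtures approach that description names can be sketched roughly as below. This is a minimal illustration of the general technique, not that repository's code; all names and the toy data are my own.

```python
import numpy as np

def bernoulli_mixture_em(X, n_components, n_iter=50, seed=0):
    """EM for a mixture of Bernoulli distributions over binary vectors.

    Hypothetical sketch: each component k has pixel-on probabilities mu[k]
    and a mixing weight pi[k], fit by alternating E- and M-steps.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_components, 1.0 / n_components)        # mixing weights
    mu = rng.uniform(0.25, 0.75, size=(n_components, d))  # Bernoulli params
    for _ in range(n_iter):
        # E-step: per-sample log-likelihood under each component, then
        # normalized responsibilities.
        log_p = X @ np.log(mu).T + (1 - X) @ np.log(1 - mu).T + np.log(pi)
        log_p -= log_p.max(axis=1, keepdims=True)         # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and parameters from responsibilities.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = np.clip((r.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, mu, r

# Toy binary data standing in for binarized MNIST pixels.
X = (np.random.default_rng(1).random((100, 20)) < 0.5).astype(float)
pi, mu, r = bernoulli_mixture_em(X, n_components=3)
print(pi.shape, mu.shape)  # (3,) (3, 20)
```

For digits one would binarize the 28x28 images and use more components; the learned `mu[k]` then look like prototype digit images.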
MNIST classifier
This repository contains the code to reproduce all of the results in our paper: Making Learners (More) Monotone, T J Viering, A Mey, M Loog, IDA 2020.
This repository implements a simple Convolutional Neural Network (CNN) from scratch for image classification. I experimented with it on MNIST digits and the COIL object dataset.
Verification of a VAE and SegNet using NNV
Assignments for Numerical Methods
UB Computer Vision
This repository contains the programmatic part of the research into the equivalence of Hebbian learning and the SVM formalism, exploring the hypothesis brought forward in "On the equivalence of Hebbian learning and the SVM formalism" [Nowotny, T. and Huerta, R.]
A basic project aimed at recognizing handwritten digits using MATLAB.
Just another backpropagation (BP) simulation.
A simple neural network implementation for MNIST dataset
A Neural Network from scratch (Extreme Learning Machine), trained on MNIST (97% accuracy).
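The Extreme Learning Machine idea mentioned above fits in a few lines: the hidden layer is random and fixed, and only the output weights are solved in closed form by least squares. This is a hedged sketch under toy data, not that repository's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for flattened MNIST: 200 samples, 64 features, 10 classes.
X = rng.standard_normal((200, 64))
y = rng.integers(0, 10, size=200)
T = np.eye(10)[y]                          # one-hot targets

n_hidden = 256
W = rng.standard_normal((64, n_hidden))    # random input weights, never trained
b = rng.standard_normal(n_hidden)          # random biases
H = np.tanh(X @ W + b)                     # hidden-layer activations

# Output weights via least squares (the only "training" step in an ELM).
beta = np.linalg.lstsq(H, T, rcond=None)[0]

pred = (H @ beta).argmax(axis=1)
accuracy = (pred == y).mean()
print(beta.shape, accuracy)
```

Because the only fit is a single linear solve, an ELM trains orders of magnitude faster than backpropagation, which is how such models reach high MNIST accuracy in seconds.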
Test my MNIST data using kNN
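kNN on MNIST, as in the entry above, reduces to distance computation plus majority vote. A minimal sketch on toy 2-D points (the function name and data are illustrative, not from the repo):

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Classify each row of test_X by majority vote among its k nearest
    training points (Euclidean distance)."""
    # Pairwise distances: shape (n_test, n_train).
    d = np.linalg.norm(test_X[:, None, :] - train_X[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]        # indices of k nearest neighbors
    votes = train_y[idx]
    # Majority vote per test sample.
    return np.array([np.bincount(v).argmax() for v in votes])

train_X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
train_y = np.array([0, 0, 1, 1])
test_X = np.array([[0., 0.5], [5., 5.5]])
print(knn_predict(train_X, train_y, test_X, k=3))  # → [0 1]
```

For real MNIST the same function applies to flattened 784-dimensional pixel vectors, though the O(n_test x n_train) distance matrix makes chunking or a KD-tree advisable at full dataset size.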