MNIST classification using Multi-Layer Perceptron (MLP) with 2 hidden layers. Some weight-initializers and batch-normalization are implemented.
Using advanced deep learning techniques on the MNIST dataset. Over 98% validation set accuracy.
MXNet Code For Demystifying Neural Style Transfer (IJCAI 2017)
TensorFlow implementation of real-time style transfer using feed-forward generation. This builds on the original style-transfer algorithm and allows for common personal computers to transform images.
Comparison of various weight and bias initializers
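Such comparisons typically contrast a naive small-random-values scheme with Xavier/Glorot and He initialization. As a minimal sketch (the helper name and scheme labels are illustrative, not from any listed repository):

```python
import numpy as np

def init_weights(fan_in, fan_out, scheme="xavier"):
    # Hypothetical helper contrasting common initialization schemes.
    rng = np.random.default_rng(0)
    if scheme == "xavier":    # Glorot: variance 2 / (fan_in + fan_out)
        std = np.sqrt(2.0 / (fan_in + fan_out))
    elif scheme == "he":      # He: variance 2 / fan_in, suited to ReLU
        std = np.sqrt(2.0 / fan_in)
    else:                     # naive fixed small standard deviation
        std = 0.01
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = init_weights(256, 128, scheme="he")
```

He initialization keeps the variance of ReLU pre-activations roughly constant across layers, which is why it tends to train deeper nets more reliably than a fixed small std.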
Implementation of different networks for MNIST
Library which can be used to build feed-forward neural networks, convolutional nets, linear regression, and logistic regression models.
ImageNet pre-trained models with batch normalization for the Caffe framework
ConvNet in TensorFlow to study the effect of Batch Normalization on the CIFAR-10 dataset
Why Batch Normalization Works so Well (best peer-reviewed project at MLDS, 2017 Spring)
Deep learning models in Python
A TensorFlow implementation of the models described in the paper "Efficient Deep Learning for Stereo Matching"
An image recognition/object detection model that detects handwritten digits and simple math operators. The output of the predicted objects (numbers & math operators) is then evaluated and solved.
An implementation of the technique of batch normalization on a feed forward neural network.
MNIST classification using a Convolutional Neural Network. Various techniques such as data augmentation, dropout, and batch normalization are implemented.
Batch Normalization is a technique for improving neural-network training by reducing internal covariate shift; this repository contains experiments pertinent to the original paper.
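The transform these repositories implement normalizes each feature over the mini-batch, then applies a learned scale and shift. A minimal NumPy sketch of the forward pass (function and parameter names are illustrative):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    # x: (batch_size, num_features). Normalize each feature over
    # the mini-batch, then scale by gamma and shift by beta.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(size=(64, 10))  # mini-batch of 64
out = batch_norm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
```

With `gamma=1` and `beta=0`, each output feature has approximately zero mean and unit variance over the batch; at inference time, implementations substitute running estimates of `mu` and `var` accumulated during training.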
CIFAR-10 classification based on AlexNet and VGG16 using TensorFlow
Adaptive Affinity Fields for Semantic Segmentation
CNN (PyTorch ver.) (in progress)