Ashish-Surve/MNIST_AutoEncoders

Introduction

MNIST ("Modified National Institute of Standards and Technology") is the de facto “Hello World” dataset of computer vision. Since its release in 1999, this classic dataset of handwritten images has served as the basis for benchmarking classification algorithms. As new machine learning techniques emerge, MNIST remains a reliable resource for researchers and learners alike.

Problem Statement

In this project, we aim to correctly identify digits from a dataset of tens of thousands of handwritten images.

Hardware

  1. NVIDIA 940MX GPU with CUDA 10.1
  2. 8 GB RAM at 2133 MHz
  3. Intel i5 @ 2.4 GHz with 4 cores

Process

The process is divided into 3 parts.

  1. Add Gaussian noise to the dataset.
  2. Create an autoencoder to denoise the images.
  3. Create a dense-layer model to identify the digits.
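
Step 1 above can be sketched as follows. This is a minimal illustration, not the repository's code: the noise factor of 0.5 and the clipping to [0, 1] are assumed conventions for MNIST pixels scaled to [0, 1].

```python
import numpy as np

def add_gaussian_noise(images, noise_factor=0.5, seed=0):
    """Add zero-mean Gaussian noise to images scaled to [0, 1].

    noise_factor is an assumed hyperparameter; the repository may
    use a different value.
    """
    rng = np.random.default_rng(seed)
    noisy = images + noise_factor * rng.standard_normal(images.shape)
    # Keep pixel values in the valid [0, 1] range after adding noise.
    return np.clip(noisy, 0.0, 1.0)

# Example on a dummy batch shaped like MNIST (28x28 grayscale).
batch = np.zeros((2, 28, 28, 1))
noisy_batch = add_gaussian_noise(batch)
print(noisy_batch.shape)  # (2, 28, 28, 1)
```

The clipping step matters: without it, noisy pixels can fall outside the range a sigmoid-output decoder can reproduce.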

Why Gaussian Noise

  • It can make the model more robust to adversarial (GAN-style) attacks.
  • It can be used for dimensionality reduction.
Steps

  1. Adding Gaussian Noise

    1. How to add Gaussian noise.
    2. Here is the link to my code. Link

  2. Create Autoencoder

    1. Encoder - Input image => CONV => RELU => BN => CONV => RELU => BN => Flatten => Dense
    2. Decoder - Encoder output => CONV_TRANSPOSE => RELU => BN => CONV_TRANSPOSE => RELU => BN => CONV_TRANSPOSE => Sigmoid
    3. Autoencoder - Encoder + Decoder

  3. Dense Layers Model for Prediction

    1. Dense => Dropout => Dense => Dropout => Dense => Dropout => Dense => Dense
    2. Output: one-hot encoded digits 0-9
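
The encoder/decoder layout described above can be sketched in Keras. Filter counts, kernel sizes, the latent dimension, and the Dense-plus-Reshape at the start of the decoder are assumptions for illustration; the repository may use different sizes.

```python
# Sketch of the described encoder/decoder; layer sizes are assumed,
# not taken from the repository.
from tensorflow.keras import Input, Model, layers

def build_autoencoder(latent_dim=16):
    # Encoder: (CONV => RELU => BN) twice, then Flatten => Dense.
    inp = Input(shape=(28, 28, 1))
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inp)
    x = layers.BatchNormalization()(x)
    x = layers.Conv2D(64, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.BatchNormalization()(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(latent_dim)(x)
    encoder = Model(inp, latent, name="encoder")

    # Decoder: project the latent vector back to a feature map (assumed
    # step), then (CONV_TRANSPOSE => RELU => BN) twice and a final
    # CONV_TRANSPOSE with a sigmoid output.
    dec_in = Input(shape=(latent_dim,))
    x = layers.Dense(7 * 7 * 64, activation="relu")(dec_in)
    x = layers.Reshape((7, 7, 64))(x)
    x = layers.Conv2DTranspose(64, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.BatchNormalization()(x)
    x = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.BatchNormalization()(x)
    out = layers.Conv2DTranspose(1, 3, padding="same", activation="sigmoid")(x)
    decoder = Model(dec_in, out, name="decoder")

    # Autoencoder = Encoder + Decoder, trained noisy -> clean with Adam.
    autoencoder = Model(inp, decoder(encoder(inp)), name="autoencoder")
    autoencoder.compile(optimizer="adam", loss="mse")
    return encoder, decoder, autoencoder
```

Training would pair noisy inputs with clean targets, e.g. `autoencoder.fit(x_noisy, x_clean, ...)`, so the network learns to remove the added noise.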
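
The dense prediction head (Dense and Dropout layers ending in a 10-way output) might look like the sketch below. Unit counts and dropout rates are assumed, and the softmax on the final Dense layer is an assumption consistent with the one-hot 0-9 output described above.

```python
from tensorflow.keras import layers, models

def build_classifier(input_dim=16, num_classes=10):
    """Dense => Dropout three times, then two final Dense layers.

    Layer widths and dropout rates are illustrative assumptions.
    """
    model = models.Sequential([
        layers.Input(shape=(input_dim,)),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(32, activation="relu"),
        # Final Dense produces class probabilities for digits 0-9,
        # matching one-hot encoded labels.
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With `categorical_crossentropy` as the loss, the labels are expected in one-hot form (e.g. via `keras.utils.to_categorical`).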
Property                 Value
-----------------------  -----
Epochs (Autoencoder)     100
Epochs (Dense)           200
Optimizer (Autoencoder)  Adam
Optimizer (Dense)        Adam

Property                 Training  Testing
-----------------------  --------  -------
No. of Images            60,000    10,000
Time (Autoencoder)       2 hrs     23 mins
Time (Digit Prediction)  20 mins   4 mins

Loss vs Epoch for Autoencoder

(chart image)

Loss vs Epoch for Predictor

(chart image)

Updated Model and Performance Metrics

Autoencoder Sample (image)

Autoencoder Loss (image)

Dense Accuracy (image)

Dense Loss (image)
