MLP-Architetures-on-MNIST-dataset

Observations from the experiments (a minimal sketch of the setup follows the list):

1] As dropout is increased from 0.2 to 0.5 with the same batch size and architecture, the number of epochs needed to converge increases.
2] Beyond a particular epoch, each model starts to overfit.
3] Without BatchNormalization, the model starts overfitting after only 4 epochs.
4] For 5 hidden layers with dropout = 0.5, training needed the maximum number of epochs, which was 50.
5] All test accuracies are in the range of 98% to 98.5%.
6] Models without BatchNormalization achieved a lower test accuracy of approximately 97%.
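For reference, here is a minimal Keras/TensorFlow sketch of the kind of MLP variant compared above. This is not the repository's exact code: the hidden-layer width (256 units), optimizer, and batch size of 128 are assumptions, while the depth, dropout rates, epoch budget, and the BatchNormalization toggle mirror the observations.

```python
# A minimal sketch of one MLP variant from the experiments (assumed details:
# 256-unit hidden layers, Adam optimizer, batch size 128).
import tensorflow as tf
from tensorflow.keras import layers, models

# Load MNIST and flatten each 28x28 image into a 784-dim vector in [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

def build_mlp(hidden_layers=2, dropout_rate=0.2, batch_norm=True):
    """Build an MLP whose depth, dropout rate, and use of BatchNormalization
    are configurable, mirroring the variations compared above."""
    model = models.Sequential([layers.Input(shape=(784,))])
    for _ in range(hidden_layers):
        model.add(layers.Dense(256, activation="relu"))
        if batch_norm:
            model.add(layers.BatchNormalization())
        model.add(layers.Dropout(dropout_rate))
    model.add(layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# The deepest configuration from observation 4: 5 hidden layers, dropout 0.5,
# trained for the maximum budget of 50 epochs.
model = build_mlp(hidden_layers=5, dropout_rate=0.5, batch_norm=True)
model.fit(x_train, y_train, batch_size=128, epochs=50,
          validation_split=0.1, verbose=2)
print("Test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```

Setting batch_norm=False in the same sketch corresponds to the runs behind observations 3 and 6, where overfitting begins earlier and test accuracy drops to roughly 97%.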

About

Experimented with different MLP architectures on the MNIST dataset using different dropout rates.
