Dropout vs. batch normalization: effect on accuracy, training and inference times - code for the paper
Updated Dec 19, 2022 - TeX
Machine Learning Practical - Coursework 2 Report: analysing problems with the VGG deep neural network architectures (with 8 and 38 hidden layers) on the CIFAR100 dataset by monitoring gradient flow during training, and exploring solutions using batch normalization and residual connections.
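For readers unfamiliar with the combination the report explores, the following is a minimal PyTorch sketch of a convolutional block that pairs batch normalization with a residual connection. It is illustrative only, not code from the repository; the class name `ResidualBNBlock` is invented here.

```python
import torch
import torch.nn as nn


class ResidualBNBlock(nn.Module):
    """Illustrative conv block with batch normalization and a residual
    (skip) connection, the two fixes for poor gradient flow explored
    in the report above."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)  # normalizes activations, stabilizing gradients
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # The skip connection gives gradients a direct path to earlier
        # layers, which is what lets much deeper networks keep training.
        return self.relu(out + x)


# Example usage: a CIFAR-sized input passes through unchanged in shape.
block = ResidualBNBlock(channels=64)
y = block(torch.randn(8, 64, 32, 32))
print(y.shape)  # torch.Size([8, 64, 32, 32])
```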