A CV toolkit for my papers.
Updated Jun 2, 2024 - Python
Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers https://arxiv.org/abs/1802.00124
Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks
MNIST classification using a neural network and backpropagation. Written in Python and depends only on NumPy
Playground repository to highlight the problem of BatchNorm layers for a blog article
Code to fold the batch norm layers of a DNN model in PyTorch
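The folding idea behind repositories like this can be sketched in a few lines of NumPy (this is a generic illustration of the arithmetic, not code from the linked repo; the function name `fold_bn` and the linear-layer setup are illustrative assumptions). Since BN(x) = gamma * (x - mean) / sqrt(var + eps) + beta is an affine map, it can be absorbed into the preceding layer's weights and bias at inference time:

```python
import numpy as np

def fold_bn(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold a BatchNorm layer into the preceding linear/conv parameters.

    For y = W @ x + b followed by BN with running statistics (mean, var):
      W' = (gamma / sqrt(var + eps))[:, None] * W
      b' = beta + (b - mean) * gamma / sqrt(var + eps)
    """
    scale = gamma / np.sqrt(var + eps)   # per-output-channel rescale
    W_folded = W * scale[:, None]
    b_folded = beta + (b - mean) * scale
    return W_folded, b_folded

# Sanity check: the folded layer matches linear -> BN on a random input.
rng = np.random.default_rng(0)
W, b = rng.normal(size=(4, 8)), rng.normal(size=4)
gamma, beta = rng.normal(size=4), rng.normal(size=4)
mean, var = rng.normal(size=4), rng.uniform(0.5, 2.0, size=4)
x = rng.normal(size=8)

y_ref = gamma * ((W @ x + b) - mean) / np.sqrt(var + 1e-5) + beta
W_f, b_f = fold_bn(W, b, gamma, beta, mean, var)
assert np.allclose(W_f @ x + b_f, y_ref)
```

The same per-output-channel rescaling applies to convolution kernels, where `scale` multiplies each output filter.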
This repository contains implementations of different deep learning models using NumPy.
Implementation of a Fully Connected Neural Network, Convolutional Neural Network (CNN), and Recurrent Neural Network (RNN) from Scratch, using NumPy.
MXNet implementation of Filter Response Normalization Layer (FRN) published in CVPR2020
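For readers unfamiliar with FRN, the core computation from the CVPR 2020 paper can be sketched in NumPy (a hedged illustration of the math, not the linked MXNet code; the function name `frn_tlu` is an assumption). FRN normalizes each channel by the mean of its squared spatial activations, with no mean subtraction and no batch statistics, and is paired with a Thresholded Linear Unit (TLU):

```python
import numpy as np

def frn_tlu(x, gamma, beta, tau, eps=1e-6):
    """Filter Response Normalization followed by a TLU activation.

    x: (N, C, H, W). nu2 is the mean of x^2 over each sample's
    per-channel spatial extent; gamma/beta/tau are learned (1, C, 1, 1).
    """
    nu2 = np.mean(x ** 2, axis=(2, 3), keepdims=True)  # (N, C, 1, 1)
    y = x / np.sqrt(nu2 + eps)                         # normalize
    y = gamma * y + beta                               # learned affine
    return np.maximum(y, tau)                          # TLU: max(y, tau)

x = np.random.default_rng(1).normal(size=(2, 3, 4, 4))
gamma = np.ones((1, 3, 1, 1))
beta = np.zeros((1, 3, 1, 1))
tau = np.full((1, 3, 1, 1), -1.0)
out = frn_tlu(x, gamma, beta, tau)
assert out.shape == x.shape and (out >= -1.0).all()
```

Because nothing depends on the batch dimension, FRN behaves identically at batch size 1, which is the paper's main selling point over BatchNorm.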
A set of experiments inspired by the paper "Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs" by Jonathan Frankle, David J. Schwab, Ari S. Morcos
Part of a larger project, this work focuses on implementing MLPs and Batch Normalization with NumPy and Python only.
Batch normalization from scratch on LeNet using tensorflow.keras on the MNIST dataset. The goal is to learn and characterize batch normalization's impact on NN performance.
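Several of the entries above reimplement batch normalization from scratch; the training-mode forward pass they all share can be sketched in NumPy (a minimal generic version, not any one repo's code; the name `batchnorm_forward` is an assumption):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Training-mode batch normalization over the batch axis.

    x: (N, D). Each feature is normalized to zero mean / unit variance
    using batch statistics, then rescaled by the learned gamma and beta.
    """
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(2).normal(loc=3.0, scale=2.0, size=(64, 5))
out = batchnorm_forward(x, gamma=np.ones(5), beta=np.zeros(5))
assert np.allclose(out.mean(axis=0), 0.0, atol=1e-6)
assert np.allclose(out.std(axis=0), 1.0, atol=1e-2)
```

At inference time, implementations replace `mu` and `var` with running averages accumulated during training, which is also what makes the folding trick described earlier possible.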