awesome-deep-learning-techniques

A curated list of awesome deep learning techniques for deep neural network training, testing, optimization, regularization, etc.

Weight Initialization

  • Xavier Initialization
  • He Initialization
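
A minimal sketch of both initializers, assuming PyTorch (the list itself is framework-agnostic; layer sizes are illustrative):

```python
import torch.nn as nn

def init_weights(module):
    # Xavier/Glorot: keeps activation variance stable; suited to tanh/sigmoid.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        nn.init.zeros_(module.bias)
    # He/Kaiming: compensates for the variance halved by ReLU.
    elif isinstance(module, nn.Conv2d):
        nn.init.kaiming_normal_(module.weight, nonlinearity="relu")
        if module.bias is not None:
            nn.init.zeros_(module.bias)

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 30 * 30, 10),  # 30x30 after a 3x3 conv on a 32x32 input
)
model.apply(init_weights)  # walks every submodule and applies the rule above
```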

Data/Input Processing

  • Input pipelining
  • Queues
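
Queues and input pipelining were explicit objects in TensorFlow 1.x; in PyTorch (assumed here purely for illustration) the same idea lives inside DataLoader, whose worker processes fill an internal queue of prefetched batches:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy in-memory dataset standing in for real data (shapes are illustrative).
images = torch.randn(1000, 3, 32, 32)
labels = torch.randint(0, 10, (1000,))
dataset = TensorDataset(images, labels)

# num_workers > 0 turns loading into a pipeline: worker processes decode and
# collate batches into an internal queue while the training loop consumes
# them, so I/O overlaps with computation instead of blocking it.
loader = DataLoader(dataset, batch_size=64, shuffle=True,
                    num_workers=2, prefetch_factor=2, pin_memory=True)

for batch_images, batch_labels in loader:
    pass  # training step goes here
```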

Data Augmentation

  • Random cropping
  • Random padding
  • Random horizontal flipping
  • Random RGB color shifting
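
One possible composition of the four augmentations, assuming torchvision and CIFAR-style 32x32 images (all parameter values are illustrative, not prescriptive):

```python
from torchvision import transforms

augment = transforms.Compose([
    transforms.Pad(4, padding_mode="reflect"),   # pad, then crop a random
    transforms.RandomCrop(32),                   # 32x32 window: random padding + cropping
    transforms.RandomHorizontalFlip(p=0.5),      # random horizontal flipping
    transforms.ColorJitter(brightness=0.2,       # random RGB color shifting
                           contrast=0.2,
                           saturation=0.2),
    transforms.ToTensor(),
])
# Applied per sample at load time, e.g. via a Dataset's transform argument.
```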

Decreasing/Changing Learning Rate

  • Learning rate decay
  • Cyclic learning rate
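
Both schedules as a PyTorch sketch (hyperparameter values are illustrative; a real run would pick one scheduler, not both):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Learning rate decay: multiply the LR by gamma every step_size epochs.
decay = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

# Cyclic learning rate: oscillate between base_lr and max_lr over a cycle.
cyclic = torch.optim.lr_scheduler.CyclicLR(optimizer, base_lr=1e-4,
                                           max_lr=0.1, step_size_up=2000)

for epoch in range(100):
    # ... train one epoch ...
    decay.step()  # step whichever scheduler you chose (StepLR once per
                  # epoch; CyclicLR is stepped once per batch instead)
```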

Regularization

  • Weight decay
    a. L2 loss
    b. L1 loss
  • Dropout
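
A PyTorch sketch of all three: L2 weight decay is usually folded into the optimizer, dropout is a layer, and an L1 penalty (lambda value illustrative) is added to the loss by hand:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # randomly zeroes activations during training only
    nn.Linear(256, 10),
)

# weight_decay adds the gradient of an L2 penalty to every update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=5e-4)

def l1_penalty(model, lam=1e-5):
    # L1 loss: add lam * sum(|w|) to the training loss before backward().
    return lam * sum(p.abs().sum() for p in model.parameters())
```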

Optimization/Gradient Descent

  • Adam Optimizer
  • SGD with momentum
  • Nesterov Accelerated Gradient (NAG)
  • Stochastic Gradient Descent (SGD)
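
The four optimizers side by side in PyTorch (constructors only; a real run would instantiate exactly one):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)

# Stochastic Gradient Descent: step along the negative mini-batch gradient.
sgd = torch.optim.SGD(model.parameters(), lr=0.01)

# SGD with momentum: accumulate a velocity term to damp oscillations.
sgd_momentum = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Nesterov Accelerated Gradient: evaluate the gradient at the look-ahead point.
nag = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, nesterov=True)

# Adam: per-parameter step sizes from running first/second moment estimates.
adam = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
```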

Normalization

  • Batch Normalization
  • Local Response Normalization (LRN)
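
Both layers in one (purely illustrative) PyTorch block; in practice a network uses one or the other, and LRN survives mostly in AlexNet-era architectures:

```python
import torch.nn as nn

features = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1, bias=False),
    # Batch Normalization: normalize each channel over the mini-batch,
    # then rescale/shift with learned gamma and beta.
    nn.BatchNorm2d(64),
    nn.ReLU(),
    # Local Response Normalization: divide each activation by the summed
    # activity of neighboring channels (AlexNet-style parameters).
    nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
)
```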

Activation Function

  • ReLU
  • Sigmoid
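
A quick numerical comparison of the two, again assuming PyTorch:

```python
import torch

x = torch.linspace(-3.0, 3.0, steps=7)

relu_out = torch.relu(x)        # max(0, x): cheap, non-saturating for x > 0
sigmoid_out = torch.sigmoid(x)  # 1 / (1 + e^-x): squashes to (0, 1),
                                # but gradients vanish at the tails
print(relu_out)
print(sigmoid_out)
```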

Segmentation

  • Fully Convolutional models
  • Conditional Random Fields (CRF)
  • Skip Connections/Fusions
  • Upsampling/Transpose Convolutions
  • Atrous/Dilated Convolutions
  • Multi-scale inputs/nets with shared weights
  • Attention to scale
  • Pixel-wise cross-entropy loss
  • Dataset and annotations
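
A toy fully convolutional model tying several of these items together (assuming PyTorch; `TinyFCN`, the layer sizes, and the 21-class setup are all hypothetical):

```python
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self, num_classes=21):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1),
                                  nn.ReLU())
        # Atrous/dilated convolution: widens the receptive field
        # without further downsampling.
        self.enc2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=2, dilation=2),
                                  nn.ReLU())
        # Upsampling via transpose convolution back to input resolution.
        self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.classify = nn.Conv2d(32, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                # (N, 32, H/2, W/2)
        e2 = self.enc2(e1)               # (N, 64, H/2, W/2)
        u = self.up(e2)                  # (N, 32, H, W)
        # Skip connection/fusion: merge shallower features into the decoder.
        u = u + nn.functional.interpolate(e1, size=u.shape[-2:])
        return self.classify(u)          # per-pixel class scores

net = TinyFCN()
images = torch.randn(2, 3, 64, 64)
masks = torch.randint(0, 21, (2, 64, 64))   # per-pixel ground-truth labels
logits = net(images)                        # (2, 21, 64, 64)
# Pixel-wise cross-entropy loss over the (N, C, H, W) score map.
loss = nn.functional.cross_entropy(logits, masks)
```

CRF post-processing, multi-scale inputs, and attention-to-scale would sit on top of a backbone like this rather than inside it.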