
murilogustineli/Survey-Activation-Functions


Survey on recently proposed activation functions for Deep Learning

Repo for storing content on my survey paper about activation functions in deep neural networks.

Publication link:

Referenced papers

- Agostinelli (2015): Learning Activation Functions to Improve Deep Neural Networks (arxiv.org)
- Apicella (2021): A survey on modern trainable activation functions (arxiv.org)
- Datta (2020): A Survey on Activation Functions and their relation with Xavier and He Normal Initialization (arxiv.org)
- Glorot (2010): Understanding the difficulty of training deep feedforward neural networks (proceedings.mlr.press)
- Gu (2018): Recent Advances in Convolutional Neural Networks (sciencedirect.com)
- Han (1995): The Influence of the Sigmoid Function Parameters on the Speed of Backpropagation Learning (springer.com)
- Hecht-Nielsen (1992): Theory of the Backpropagation Neural Network (sciencedirect.com)
- Hornik (1989): Multilayer Feedforward Networks are Universal Approximators (sciencedirect.com)
- LeCun (1989): Backpropagation Applied to Handwritten Zip Code Recognition (IEEE.org)
- LeCun (1989): Handwritten Digit Recognition with a Back-Propagation Network (proceedings.neurips.cc)
- LeCun (1998): Gradient-Based Learning Applied to Document Recognition (IEEE.org)
- LeCun (2012): Efficient BackProp (springer.com)
- LeCun (2015): Deep Learning (nature.com)
- Misra (2020): Mish: A Self Regularized Non-Monotonic Activation Function (arxiv.org)
- Noel (2021): Growing Cosine Unit: A Novel Oscillatory Activation Function That Can Speedup Training and Reduce Parameters in Convolutional Neural Networks (arxiv.org)
- Noel (2022): Biologically Inspired Oscillating Activation Functions Can Bridge the Performance Gap between Biological and Artificial Neurons (arxiv.org)
- Ramachandran (2017): Searching for Activation Functions (arxiv.org)
- Russakovsky (2014): ImageNet Large Scale Visual Recognition Challenge (arxiv.org)
- Schmidhuber (2014): Deep Learning in Neural Networks: An Overview (arxiv.org)
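As an illustrative sketch (not taken from the survey itself), several of the activation functions covered by the referenced papers can be written directly from their published formulas: Swish from Ramachandran (2017), Mish from Misra (2020), and the Growing Cosine Unit from Noel (2021). The NumPy code below is an assumption about how one might implement them for experimentation, not code from this repository:

```python
import numpy as np

def sigmoid(x):
    # Classic logistic sigmoid, the baseline studied in Han (1995).
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish (Ramachandran 2017): x * sigmoid(beta * x).
    return x * sigmoid(beta * x)

def mish(x):
    # Mish (Misra 2020): x * tanh(softplus(x)),
    # with softplus(x) = ln(1 + e^x) computed stably via log1p.
    return x * np.tanh(np.log1p(np.exp(x)))

def gcu(x):
    # Growing Cosine Unit (Noel 2021): the oscillatory x * cos(x).
    return x * np.cos(x)
```

All four are smooth, and all except the sigmoid pass through the origin, which is one property the oscillatory papers above highlight.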
