Christophe-pere/Activation_functions

Activation functions for Neural Networks

This repository contains a notebook implementing the most common activation functions.

Content

  • Binary
  • Linear
  • Sigmoid
  • Tanh
  • ReLU
  • Leaky ReLU (LReLU)
  • Parametric ReLU (PReLU)
  • Exponential Linear Unit (eLU)
  • ReLU-6
  • Softplus
  • Softsign
  • Softmax
  • Gaussian
  • Swish

This notebook is linked with the article "What is an activation function?" on medium.com.
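As a quick reference, the activation functions listed above can be sketched in NumPy as follows. This is a minimal illustration, not the notebook's exact code; names and default parameters (e.g. `alpha=0.01` for Leaky ReLU, `beta=1.0` for Swish) are common conventions and may differ from the notebook.

```python
import numpy as np

def binary(x):
    """Binary step: 1 if x >= 0, else 0."""
    return np.where(x >= 0, 1.0, 0.0)

def linear(x):
    """Identity: passes the input through unchanged."""
    return x

def sigmoid(x):
    """Logistic function, squashes input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Hyperbolic tangent, squashes input into (-1, 1)."""
    return np.tanh(x)

def relu(x):
    """Rectified Linear Unit: max(0, x)."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU, but with a small fixed slope alpha for x < 0."""
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    """Parametric ReLU: same form as Leaky ReLU, but alpha is learned."""
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    """Exponential Linear Unit: smooth negative branch alpha*(e^x - 1)."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def relu6(x):
    """ReLU capped at 6: min(max(0, x), 6)."""
    return np.minimum(np.maximum(0.0, x), 6.0)

def softplus(x):
    """Smooth approximation of ReLU: log(1 + e^x)."""
    return np.log1p(np.exp(x))

def softsign(x):
    """x / (1 + |x|), a polynomial alternative to tanh."""
    return x / (1.0 + np.abs(x))

def softmax(x):
    """Normalizes a vector into a probability distribution."""
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

def gaussian(x):
    """Bell-shaped activation: e^(-x^2)."""
    return np.exp(-x ** 2)

def swish(x, beta=1.0):
    """Swish: x * sigmoid(beta * x)."""
    return x * sigmoid(beta * x)
```

For example, `relu6(np.array([-2.0, 3.0, 10.0]))` yields `[0., 3., 6.]`, and `softmax` of any vector sums to 1.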
