af

Activation functions for neural networks.

These activation functions are included (a sketch of a few of them follows the list):

  • Swish (x / (1 + exp(-x)))
  • Sigmoid (1 / (1 + exp(-x)))
  • SoftPlus (log(1 + exp(x)))
  • Gaussian01 (exp(-(x * x) / 2.0))
  • Sin (math.Sin(math.Pi * x))
  • Cos (math.Cos(math.Pi * x))
  • Linear (x)
  • Inv (-x)
  • ReLU (x >= 0 ? x : 0)
  • Squared (x * x)

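Each of these takes a single float64 and returns a float64. As a rough sketch of what the formulas above look like when written out directly in Go (illustrative only, not necessarily the package's actual source):

```go
package main

import (
	"fmt"
	"math"
)

// Sigmoid: 1 / (1 + exp(-x))
func Sigmoid(x float64) float64 {
	return 1.0 / (1.0 + math.Exp(-x))
}

// Swish: x / (1 + exp(-x)), equivalently x * Sigmoid(x)
func Swish(x float64) float64 {
	return x / (1.0 + math.Exp(-x))
}

// ReLU: x if x >= 0, otherwise 0
func ReLU(x float64) float64 {
	if x >= 0 {
		return x
	}
	return 0
}

func main() {
	for _, x := range []float64{-2, 0, 2} {
		fmt.Printf("x=%5.1f  sigmoid=%.4f  swish=%.4f  relu=%.1f\n",
			x, Sigmoid(x), Swish(x), ReLU(x))
	}
}
```
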
These math functions are included just for convenience:

  • Abs (math.Abs)
  • Tanh (math.Tanh)

One function that takes two arguments is also included (sketched below):

  • PReLU (x >= 0 ? x : x * a)

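A minimal sketch of the two-argument form, following the listed formula (again illustrative, not necessarily the package's exact source):

```go
// PReLU returns x for x >= 0 and x * a otherwise,
// where a is the slope applied to negative inputs.
func PReLU(x, a float64) float64 {
	if x >= 0 {
		return x
	}
	return x * a
}
```
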
Requirements

  • Go 1.11 or later.

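Example use

With Go modules (Go 1.11 or later), the package can be pulled in with a plain import. A small usage sketch, assuming the import path github.com/xyproto/af and that the functions are exported under the names listed above:

```go
package main

import (
	"fmt"

	"github.com/xyproto/af"
)

func main() {
	// Assumption: the package exports the listed functions directly,
	// e.g. af.Sigmoid, af.Swish and the two-argument af.PReLU.
	fmt.Println(af.Sigmoid(0.5))
	fmt.Println(af.Swish(0.5))
	fmt.Println(af.PReLU(-0.5, 0.1))
}
```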