# af

Activation functions for neural networks.

These activation functions are included:

  • Swish (x / (1 + exp(-x)))
  • Sigmoid (1 / (1 + exp(-x)))
  • SoftPlus (log(1 + exp(x)))
  • Gaussian01 (exp(-(x * x) / 2.0))
  • Sin (math.Sin(math.Pi * x))
  • Cos (math.Cos(math.Pi * x))
  • Linear (x)
  • Inv (-x)
  • ReLU (x >= 0 ? x : 0)
  • Squared (x * x)
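As a minimal sketch of a few of the formulas above, here they are implemented standalone (this is not necessarily the package's own API, just the listed math written out in Go):

```go
package main

import (
	"fmt"
	"math"
)

// Swish: x / (1 + exp(-x))
func Swish(x float64) float64 { return x / (1 + math.Exp(-x)) }

// Sigmoid: 1 / (1 + exp(-x))
func Sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

// ReLU: x if x >= 0, else 0
func ReLU(x float64) float64 {
	if x >= 0 {
		return x
	}
	return 0
}

func main() {
	fmt.Println(Sigmoid(0)) // 0.5
	fmt.Println(ReLU(-2))   // 0
	fmt.Println(Swish(0))   // 0
}
```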

These math functions are included just for convenience:

  • Abs (math.Abs)
  • Tanh (math.Tanh)

One function that takes two arguments is also included:

  • PReLU (x >= 0 ? x : x * a)
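The PReLU formula can be sketched like this, where `a` is the slope used for negative inputs (again a standalone rendering of the formula, not necessarily the package's exact signature):

```go
package main

import "fmt"

// PReLU: x if x >= 0, else x * a
func PReLU(x, a float64) float64 {
	if x >= 0 {
		return x
	}
	return x * a
}

func main() {
	fmt.Println(PReLU(3, 0.1))  // 3
	fmt.Println(PReLU(-2, 0.1)) // -0.2
}
```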

## Requirements

  • Go 1.11 or later.

## General information