FTSwish

Flattened Threshold Swish Activation function - PyTorch implementation

PyTorch implementation of the Flattened Threshold Swish (FTSwish) activation function for deep learning. The theory was developed in this paper: https://arxiv.org/abs/1812.06247

Adds support for a mean shift, an adjustable threshold, and max-value clamping.

FTSwish is defined as:

FTSwish(x) = ReLU(x) * Sigmoid(x) + T, for x > 0

FTSwish(x) = T, for x <= 0 (default T = -0.20)

For positive values it mimics the Swish activation, shifted by the threshold T. For negative values it outputs the fixed threshold T.
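The definition above can be sketched as a PyTorch module. This is a minimal illustration, not the repository's exact code; the `mean_shift` and `max_value` parameter names are assumptions based on the features listed above:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FTSwish(nn.Module):
    """Flattened Threshold Swish (sketch, not the repo's exact implementation):
    relu(x) * sigmoid(x) + T for x > 0, else the flat threshold T."""

    def __init__(self, threshold=-0.20, mean_shift=None, max_value=None):
        super().__init__()
        self.threshold = threshold     # T, the flat negative-side value
        self.mean_shift = mean_shift   # optional constant subtracted from the output (assumed name)
        self.max_value = max_value     # optional upper clamp on the output (assumed name)

    def forward(self, x):
        # Positive branch: swish shifted by the threshold T
        pos = F.relu(x) * torch.sigmoid(x) + self.threshold
        # Negative (and zero) inputs map to the flat threshold T
        out = torch.where(x > 0, pos, torch.full_like(x, self.threshold))
        if self.max_value is not None:
            out = out.clamp(max=self.max_value)
        if self.mean_shift is not None:
            out = out - self.mean_shift
        return out
```

For example, `FTSwish()(torch.tensor([-3.0]))` returns the flat threshold `-0.20`, while positive inputs follow the shifted swish curve.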
