LightRelu

Customized PyTorch implementation of LiSHT (linear scaled hyperbolic tangent) activation function for deep learning, with mean shift and clamping.

Original paper here:

LiSHT: Non-Parametric Linearly Scaled Hyperbolic Tangent Activation Function for Neural Networks (https://arxiv.org/abs/1901.05894)

Activation map comparison:

MNIST - ReLU vs. LiSHT:

LightRelu = customized LiSHT in PyTorch, with mean shift and clamp:

I implemented LiSHT in PyTorch and wrapped it with a mean shift and a clamp (0.46 and 7.5, respectively).
More testing is in progress, but so far it looks very promising!
Note: cut your learning rate in half vs. ReLU, as it learns very rapidly.
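A minimal sketch of that wrapper, assuming the mean shift is a plain subtraction applied after x * tanh(x) and the clamp caps the output at 7.5; the parameter names `sub` and `maxv` are illustrative assumptions, not taken from the repo source:

```python
import torch
import torch.nn as nn

class LightRelu(nn.Module):
    """LiSHT (x * tanh(x)) with a mean shift and clamp.

    The constants 0.46 (shift) and 7.5 (clamp ceiling) come from the
    text above; exactly how they are applied here is an assumption.
    """
    def __init__(self, sub: float = 0.46, maxv: float = 7.5):
        super().__init__()
        self.sub = sub
        self.maxv = maxv

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # LiSHT: linearly scaled hyperbolic tangent
        x = x * torch.tanh(x)
        # Mean shift: re-center activations toward zero mean
        x = x - self.sub
        # Clamp to cap large positive activations
        return x.clamp_max(self.maxv)
```

It drops in anywhere an activation module goes, e.g. nn.Sequential(nn.Linear(784, 128), LightRelu(), nn.Linear(128, 10)).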

Comparisons of LightRelu vs. ReLU and GeneralRelu

(GeneralRelu is an upcoming ReLU variant with leakiness, mean shift, and clamp; a sketch follows below):
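For comparison, a sketch of a GeneralRelu along those lines (leaky ReLU plus mean shift plus clamp); the signature and defaults here are assumptions for illustration, not the repo's actual API:

```python
import torch.nn as nn
import torch.nn.functional as F

class GeneralRelu(nn.Module):
    """ReLU variant with optional leakiness, mean shift, and clamp.

    All three behaviors are off by default; the parameter names
    are illustrative assumptions.
    """
    def __init__(self, leak=None, sub=None, maxv=None):
        super().__init__()
        self.leak, self.sub, self.maxv = leak, sub, maxv

    def forward(self, x):
        # Leaky or plain ReLU depending on configuration
        x = F.leaky_relu(x, self.leak) if self.leak is not None else F.relu(x)
        # Optional mean shift toward zero-centered activations
        if self.sub is not None:
            x = x - self.sub
        # Optional ceiling on activations
        if self.maxv is not None:
            x = x.clamp_max(self.maxv)
        return x
```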

ReLU:

LightRelu:

Histogram of activations (smoother is better): GeneralRelu vs. LightRelu, and in last place, ReLU:

GeneralRelu:

LightRelu:

ReLU:
