Hi
Is it possible (does it make sense?) to have an activation threshold (like the axon hillock in a biological neuron) for tanh or sigmoid? The get_output_for method in a Conv layer will always return a nonzero output (i.e., a neuron will always fire, however small its output, unless it uses ReLU) — right?
Activation functions are defined in lasagne.nonlinearities. It should be easy enough to try this out by writing a tanh variant with a threshold. Feel free to reopen if you think this is a nolearn issue.
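To illustrate what such a thresholded tanh could look like, here is a minimal sketch in NumPy. The function name, threshold value, and elementwise formulation are all hypothetical, not from Lasagne; a real Lasagne nonlinearity would express the same logic with Theano tensor ops (e.g. T.tanh and T.switch) so it remains symbolic and differentiable.

```python
import numpy as np

def thresholded_tanh(x, threshold=0.1):
    """Hypothetical tanh with an activation threshold: outputs 0 wherever
    the magnitude of tanh(x) falls below `threshold`, mimicking a neuron
    that only "fires" above some minimum activation level."""
    y = np.tanh(x)
    return np.where(np.abs(y) < threshold, 0.0, y)

# Small activations are suppressed to 0; large ones pass through unchanged.
print(thresholded_tanh(np.array([-2.0, -0.05, 0.0, 0.05, 2.0])))
```

Note that the hard cutoff makes the function discontinuous at the threshold, which can hurt gradient-based training; in a Theano graph the gradient through the zeroed region would simply be zero, similar to the dead region of ReLU.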