
API - Activations

To keep TensorLayer simple, we minimize the number of activation functions as much as we can, and we encourage you to use TensorFlow's native functions instead. TensorFlow provides tf.nn.relu, tf.nn.relu6, tf.nn.elu, tf.nn.softplus, tf.nn.softsign, and so on. More official TensorFlow activation functions can be found in the TensorFlow API documentation. For parametric activations, please read the layer APIs.

The shortcut for tensorlayer.activation is tensorlayer.act.
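For example, a TensorFlow activation can be passed straight into a layer's act argument. The snippet below is a minimal sketch assuming the TensorLayer 1.x graph-building API (tl.layers.InputLayer and tl.layers.DenseLayer); the placeholder shape and layer names are illustrative only.

import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
network = tl.layers.InputLayer(x, name='input')
# A TensorFlow activation is passed directly via the act argument.
network = tl.layers.DenseLayer(network, n_units=800, act=tf.nn.relu, name='dense1')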

Your activation

Customizing an activation function in TensorLayer is very easy. The following example implements an activation that multiplies its input by 2; a usage sketch follows the definition. For more complex activations, the TensorFlow API will be required.

def double_activation(x):
    # Scale the input tensor by a factor of 2, element-wise.
    return x * 2
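A custom function like this can then be used anywhere a built-in activation is accepted. The following usage sketch assumes the same TensorLayer 1.x DenseLayer API as above; the layer name is hypothetical.

# network is an existing TensorLayer layer; double_activation is defined above.
network = tl.layers.DenseLayer(network, n_units=100, act=double_activation, name='double')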

tensorlayer.activation

leaky_relu
leaky_relu6
leaky_twice_relu6
ramp
swish
sign
hard_tanh
pixel_wise_softmax
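Activations that take extra parameters, such as leaky_relu, can be wrapped in a lambda so that they still fit the single-argument act interface. A minimal sketch; passing the slope 0.2 positionally assumes leaky_relu's second parameter is the negative slope.

# Wrap leaky_relu so a custom slope can be set while keeping a one-argument callable.
network = tl.layers.DenseLayer(network, n_units=100,
                               act=lambda x: tl.act.leaky_relu(x, 0.2),
                               name='dense_lrelu')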

Ramp

ramp

Leaky ReLU

leaky_relu

Leaky ReLU6

leaky_relu6

Twice Leaky ReLU6

leaky_twice_relu6

Swish

swish

Sign

sign

Hard Tanh

hard_tanh

Pixel-wise softmax

pixel_wise_softmax

Parametric activation

See tensorlayer.layers.
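For instance, a parametric ReLU, whose negative slope is learned during training, is provided as a layer rather than a function. A minimal sketch, assuming the TensorLayer 1.x PReluLayer API and its channel_shared argument:

# PReluLayer learns the negative slope as a trainable parameter.
network = tl.layers.PReluLayer(network, channel_shared=False, name='prelu')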