To keep TensorLayerX simple, we minimize the number of built-in activation functions, so we encourage you to write customized activation functions. For parametric activations, please read the layer APIs.

Writing a customized activation function in TensorLayerX is easy. The following example implements an activation that multiplies its input by 2.
.. code-block:: python

  from tensorlayerx.nn import Module

  class DoubleActivation(Module):

      def __init__(self):
          # initialize the parent Module, rather than skipping it with `pass`
          super(DoubleActivation, self).__init__()

      def forward(self, x):
          return x * 2

  double_activation = DoubleActivation()
For more complex activations, the API of the backend framework (TensorFlow, MindSpore, PaddlePaddle, or PyTorch) will be required.
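As a concrete illustration of what such an activation computes, consider Swish, f(x) = x · sigmoid(x). The sketch below expresses the formula with plain Python floats so it stays self-contained; inside a real Module's ``forward`` you would write the same formula with your chosen backend's tensor operations (that mapping to backend ops is an assumption about whichever framework you use).

```python
import math

def sigmoid(x):
    # logistic sigmoid on a plain float; a backend would provide
    # an elementwise tensor version of this
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    # Swish activation: x * sigmoid(x)
    return x * sigmoid(x)

print(swish(0.0))  # sigmoid(0) = 0.5, so swish(0) = 0.0
```

The same pattern applies to any formula-based activation: implement the math in ``forward`` using the backend's ops so it runs on tensors and stays differentiable.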
.. automodule:: tensorlayerx.nn.activation

.. autosummary::

   ELU
   PRelu
   PRelu6
   PTRelu6
   ReLU
   ReLU6
   Softplus
   LeakyReLU
   LeakyReLU6
   LeakyTwiceRelu6
   Ramp
   Swish
   HardTanh
   Tanh
   Sigmoid
   Softmax
   Mish
   LogSoftmax
   HardSigmoid
   Hardswish
.. autoclass:: ELU
.. autoclass:: PRelu
.. autoclass:: PRelu6
.. autoclass:: PTRelu6
.. autoclass:: ReLU
.. autoclass:: ReLU6
.. autoclass:: Softplus
.. autoclass:: LeakyReLU
.. autoclass:: LeakyReLU6
.. autoclass:: LeakyTwiceRelu6
.. autoclass:: Ramp
.. autoclass:: Swish
.. autoclass:: HardTanh
.. autoclass:: Mish
.. autoclass:: Tanh
.. autoclass:: Sigmoid
.. autoclass:: Softmax
.. autoclass:: LogSoftmax
.. autoclass:: HardSigmoid
.. autoclass:: Hardswish
See :mod:`tensorlayerx.nn`.