.. note::

    The prefixes :code:`cart` and :code:`pol` denote Type A (Cartesian) and Type B (polar) activation functions, respectively, following the notation of [CIT2003-KUROE]_.

TYPE A: Cartesian form
----------------------

.. py:method:: cart_sigmoid(z)

    Called with the :code:`'cart_sigmoid'` string.

    Applies the `sigmoid <https://www.tensorflow.org/api_docs/python/tf/keras/activations/sigmoid>`_ function to both the real and imaginary parts of z:

    .. math::

        \frac{1}{1 + e^{-x}} + j  \frac{1}{1 + e^{-y}}

    where

    .. math::

        z = x + j y

    :param z: Tensor to be used as input of the activation function
    :return: Tensor result of the applied activation function
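
    All Type A functions follow the same split-apply-recombine pattern. The following is a minimal illustrative sketch of that pattern assuming TensorFlow is available; it is not the library's actual implementation:

    .. code-block:: python

        import tensorflow as tf

        def cart_sigmoid_sketch(z):
            # Apply the real-valued sigmoid to the real and imaginary parts
            # separately, then recombine them into a single complex tensor.
            return tf.complex(tf.keras.activations.sigmoid(tf.math.real(z)),
                              tf.keras.activations.sigmoid(tf.math.imag(z)))

        z = tf.constant([1.0 - 2.0j, -0.5 + 0.5j], dtype=tf.complex64)
        print(cart_sigmoid_sketch(z))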

.. py:method:: cart_elu(z, alpha=0.1)

    Applies the `Exponential Linear Unit (ELU) <https://www.tensorflow.org/api_docs/python/tf/keras/activations/elu>`_ to both the real and imaginary parts of z:

    .. math::

        f(x) =
        \begin{cases}
            x, & x > 0 \\
            \alpha (e^{x} - 1), & x \leq 0
        \end{cases}

    :param z: Input tensor.
    :param alpha: A scalar, slope of negative section.
    :return: Tensor result of the applied activation function
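
    For reference, the piecewise definition above can be checked numerically against TensorFlow's real-valued ELU (illustrative sketch only):

    .. code-block:: python

        import tensorflow as tf

        alpha = 0.1
        x = tf.constant([-2.0, -1.0, 0.0, 2.0])
        # For x > 0 ELU is the identity; for x <= 0 it is alpha * (exp(x) - 1).
        manual = tf.where(x > 0, x, alpha * (tf.exp(x) - 1.0))
        print(manual)
        print(tf.keras.activations.elu(x, alpha=alpha))  # matches the line above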

.. py:method:: cart_exponential(z)

    Exponential activation function. Applies the `exponential activation <https://www.tensorflow.org/api_docs/python/tf/keras/activations/exponential>`_ to both the real and imaginary parts of z:

    .. math::

        e^x

    :param z: Input tensor.
    :return: Tensor result of the applied activation function

.. py:method:: cart_hard_sigmoid(z)

    Applies the `hard sigmoid <https://www.tensorflow.org/api_docs/python/tf/keras/activations/hard_sigmoid>`_ function to both the real and imaginary parts of z.
    The hard sigmoid is faster to compute than the sigmoid activation:

    .. math::

        f(x) =
        \begin{cases}
            0, & x < -2.5 \\
            1, & x > 2.5 \\
            0.2 x + 0.5, & -2.5 \leq x \leq 2.5
        \end{cases}

    :param z: Input tensor.
    :return: Tensor result of the applied activation function
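
    The piecewise formula is just the linear ramp :math:`0.2 x + 0.5` clipped to :math:`[0, 1]`; a small sketch of that equivalence, transcribing the formula above:

    .. code-block:: python

        import tensorflow as tf

        def hard_sigmoid_piecewise(x):
            # Linear ramp 0.2 * x + 0.5 clipped to the [0, 1] interval,
            # i.e. the piecewise definition given above.
            return tf.clip_by_value(0.2 * x + 0.5, 0.0, 1.0)

        x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])
        print(hard_sigmoid_piecewise(x))  # [0.0, 0.3, 0.5, 0.7, 1.0]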

.. py:method:: cart_relu(z, alpha=0.0, max_value=None, threshold=0)

    Applies the `Rectified Linear Unit <https://www.tensorflow.org/api_docs/python/tf/keras/activations/relu>`_ to both the real and imaginary parts of z.

    With default values, the relu function returns the element-wise :math:`\max(x, 0)`.

    Otherwise, it follows:

    .. math::

        f(x) =
        \begin{cases}
            \textrm{max\_value}, & x \geq \textrm{max\_value} \\
            x, & \textrm{threshold} \leq x < \textrm{max\_value} \\
            \alpha (x - \textrm{threshold}), & \textrm{otherwise}
        \end{cases}

    :param z: Input tensor.
    :param alpha: A scalar, slope for values below the threshold. Default: 0.0
    :param max_value: Saturation value; :code:`None` means no saturation. Default: :code:`None`
    :param threshold: Value below which the slope :code:`alpha` is applied. Default: 0
    :return: Tensor result of the applied activation function
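
    As with the other Type A functions, the extra arguments are simply forwarded to the real-valued relu applied to each component; a hedged sketch, not the library's implementation:

    .. code-block:: python

        import tensorflow as tf

        def cart_relu_sketch(z, alpha=0.0, max_value=None, threshold=0.0):
            # relu is applied independently to the real and imaginary parts;
            # alpha, max_value and threshold are forwarded unchanged.
            def act(x):
                return tf.keras.activations.relu(x, alpha=alpha,
                                                 max_value=max_value,
                                                 threshold=threshold)
            return tf.complex(act(tf.math.real(z)), act(tf.math.imag(z)))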

.. py:method:: cart_leaky_relu(z, alpha=0.2, name=None)

    Applies the `Leaky Rectified Linear Unit <https://www.tensorflow.org/api_docs/python/tf/nn/leaky_relu>`_ [CIT2013-MAAS]_ (`source <http://robotics.stanford.edu/~amaas/papers/relu_hybrid_icml2013_final.pdf>`_) to both the real and imaginary parts of z.

    :param z: Input tensor.
    :param alpha: Slope of the activation function at x < 0. Default: 0.2
    :param name: A name for the operation (optional).
    :return: Tensor result of the applied activation function

.. py:method:: cart_selu(z)

    Applies the `Scaled Exponential Linear Unit (SELU) <https://www.tensorflow.org/api_docs/python/tf/keras/activations/selu>`_ [CIT2017-KLAMBAUER]_ (`source <https://arxiv.org/abs/1706.02515>`_) to both the real and imaginary parts of z.

    The scaled exponential unit activation:

    .. math::

        \textrm{scale} * \textrm{elu}(x, \alpha)

    where :math:`\textrm{scale}` and :math:`\alpha` are pre-defined constants.

    :param z: Input tensor.
    :return: Tensor result of the applied activation function
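
    SELU is the ELU scaled by fixed constants from [CIT2017-KLAMBAUER]_; a quick numerical check of that relation (constants rounded here for readability):

    .. code-block:: python

        import tensorflow as tf

        SCALE = 1.0507   # approximate SELU constants
        ALPHA = 1.6733   # from Klambauer et al. (2017)

        x = tf.constant([-1.0, 0.0, 1.0])
        print(SCALE * tf.keras.activations.elu(x, alpha=ALPHA))
        print(tf.keras.activations.selu(x))  # closely matches the line above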

.. py:method:: cart_softplus(z)

    Applies the `Softplus <https://www.tensorflow.org/api_docs/python/tf/keras/activations/softplus>`_ activation function to both the real and imaginary parts of z.
    The softplus activation:

    .. math::

        \log(e^x + 1)

    :param z: Input tensor.
    :return: Tensor result of the applied activation function

.. py:method:: cart_softsign(z)

    Applies the `Softsign <https://www.tensorflow.org/api_docs/python/tf/keras/activations/softsign>`_ activation function to both the real and imaginary parts of z.
    The softsign activation:

    .. math::

        \frac{x}{\lvert x \rvert + 1}

    :param z: Input tensor.
    :return: Tensor result of the applied activation function

.. py:method:: cart_tanh(z)

    Applies the `Hyperbolic Tangent <https://www.tensorflow.org/api_docs/python/tf/keras/activations/tanh>`_ (tanh) activation function to both the real and imaginary parts of z.

    The tanh activation:

    .. math::

        \tanh(x) = \frac{\sinh(x)}{\cosh(x)} = \frac{e^x - e^{-x}}{e^x + e^{-x}}

    The derivative of tanh is computed as :math:`1 - \tanh^2(x)`, so it is fast to compute during backpropagation.

    :param z: Input tensor.
    :return: Tensor result of the applied activation function
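
    The derivative identity mentioned above can be verified with automatic differentiation (illustrative only):

    .. code-block:: python

        import tensorflow as tf

        x = tf.constant([-1.0, 0.0, 1.0])
        with tf.GradientTape() as tape:
            tape.watch(x)
            y = tf.keras.activations.tanh(x)

        print(tape.gradient(y, x))  # autodiff derivative of tanh
        print(1.0 - y ** 2)         # matches, by the identity 1 - tanh(x)^2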


TYPE B: Polar form
------------------

.. py:method:: pol_selu(z)

    Applies `Scaled Exponential Linear Unit (SELU) <https://www.tensorflow.org/api_docs/python/tf/keras/activations/selu>`_ [CIT2017-KLAMBAUER]_ (`source <https://arxiv.org/abs/1706.02515>`_) to the absolute value of z, keeping the phase unchanged.

    The scaled exponential unit activation:

    .. math::

        \textrm{scale} * \textrm{elu}(x, \alpha)

    :param z: Input tensor.
    :return: Tensor result of the applied activation function
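
    A minimal sketch of the Type B idea, applying the activation to the magnitude while reattaching the original phase (an assumed illustration, not the library's implementation):

    .. code-block:: python

        import tensorflow as tf

        def pol_selu_sketch(z):
            # SELU acts on the magnitude only; the phase (angle) of every
            # element is kept unchanged.
            magnitude = tf.keras.activations.selu(tf.math.abs(z))
            phase = tf.math.angle(z)
            return tf.complex(magnitude * tf.cos(phase),
                              magnitude * tf.sin(phase))

        z = tf.constant([1.0 + 1.0j, -2.0 + 0.5j], dtype=tf.complex64)
        print(pol_selu_sketch(z))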

.. [CIT2003-KUROE] Kuroe, Yasuaki, Mitsuo Yoshida, and Takehiro Mori. "On activation functions for complex-valued neural networks: existence of energy functions." Artificial Neural Networks and Neural Information Processing, ICANN/ICONIP 2003. Springer, Berlin, Heidelberg, 2003. 985-992.
.. [CIT2013-MAAS] A. L. Maas, A. Y. Hannun, and A. Y. Ng, "Rectifier Nonlinearities Improve Neural Network Acoustic Models," 2013.
.. [CIT2017-KLAMBAUER] G. Klambauer, T. Unterthiner, A. Mayr, and S. Hochreiter, "Self-Normalizing Neural Networks," arXiv:1706.02515 [cs, stat], Sep. 2017. Available: http://arxiv.org/abs/1706.02515.