.. currentmodule:: objax.functional
Due to the large number of APIs in this section, it is organized into the following sub-sections: ``objax.functional``, ``objax.functional.divergence``, ``objax.functional.loss``, and ``objax.functional.parallel``.
.. autosummary::

    celu
    elu
    leaky_relu
    log_sigmoid
    log_softmax
    logsumexp
    relu
    selu
    sigmoid
    softmax
    softplus
    tanh
.. autofunction:: celu
.. autofunction:: elu
.. autofunction:: leaky_relu
.. autofunction:: log_sigmoid
.. autofunction:: log_softmax
.. autofunction:: logsumexp
.. autofunction:: relu
.. autofunction:: selu
.. autofunction:: sigmoid
.. autofunction:: softmax
.. autofunction:: softplus
.. autofunction:: tanh
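As a quick illustration of the activation functions above, here is a minimal sketch. All of these are element-wise except ``softmax`` and ``log_softmax``, which normalize along an axis; the input values are arbitrary.

.. code-block:: python

    import jax.numpy as jn
    import objax

    x = jn.array([-2.0, -0.5, 0.0, 0.5, 2.0])

    # Element-wise activations: each returns an array of the same shape.
    y_relu = objax.functional.relu(x)   # negative entries clamped to 0
    y_tanh = objax.functional.tanh(x)   # values squashed into (-1, 1)

    # softmax normalizes so the outputs sum to 1; log_softmax is its
    # numerically stabler logarithm.
    p = objax.functional.softmax(x)
    log_p = objax.functional.log_softmax(x)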
.. autosummary::

    average_pool_2d
    batch_to_space2d
    channel_to_space2d
    max_pool_2d
    space_to_batch2d
    space_to_channel2d
.. autofunction:: average_pool_2d

For a definition of pooling, including examples, see `Pooling Layer <https://cs231n.github.io/convolutional-networks/#pool>`_.
.. autofunction:: batch_to_space2d
.. autofunction:: channel_to_space2d
.. autofunction:: max_pool_2d

For a definition of pooling, including examples, see `Pooling Layer <https://cs231n.github.io/convolutional-networks/#pool>`_.
.. autofunction:: space_to_batch2d
.. autofunction:: space_to_channel2d
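The pooling functions operate on batched images in NCHW layout (batch, channels, height, width), which is the layout Objax uses for convolutions and pooling. A minimal sketch, using an explicit 2x2 window:

.. code-block:: python

    import jax.numpy as jn
    import objax

    # NCHW layout: (batch, channels, height, width).
    x = jn.arange(2 * 3 * 8 * 8, dtype=jn.float32).reshape((2, 3, 8, 8))

    # A 2x2 pooling window halves each spatial dimension:
    # (2, 3, 8, 8) -> (2, 3, 4, 4).
    y_max = objax.functional.max_pool_2d(x, size=2)
    y_avg = objax.functional.average_pool_2d(x, size=2)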
.. autosummary::

    dynamic_slice
    flatten
    one_hot
    pad
    stop_gradient
    top_k
    rsqrt
    upscale_nn
.. autofunction:: dynamic_slice
.. autofunction:: flatten
.. autofunction:: one_hot
.. autofunction:: pad
.. autofunction:: stop_gradient
.. autofunction:: top_k
.. autofunction:: rsqrt
.. autofunction:: upscale_nn
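A minimal sketch of a few of the utilities above. This assumes ``one_hot`` takes the class count as its second argument (mirroring ``jax.nn.one_hot``) and that ``upscale_nn`` takes a ``scale`` factor:

.. code-block:: python

    import jax.numpy as jn
    import objax

    x = jn.zeros((4, 3, 8, 8))

    # flatten keeps the batch dimension and collapses the rest:
    # (4, 3, 8, 8) -> (4, 192).
    v = objax.functional.flatten(x)

    # one_hot turns integer class labels into one-hot vectors: (3,) -> (3, 5).
    labels = jn.array([0, 2, 4])
    targets = objax.functional.one_hot(labels, 5)

    # upscale_nn performs nearest-neighbor upsampling:
    # (4, 3, 8, 8) -> (4, 3, 16, 16).
    y = objax.functional.upscale_nn(x, scale=2)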
.. currentmodule:: objax.functional.divergence
.. autosummary::

    kl
.. autofunction:: kl

.. math::

    kl(p, q) = p \cdot \log{\frac{p + \epsilon}{q + \epsilon}}

The :math:`\epsilon` term is added so that the ratio remains well-defined when :code:`p` or :code:`q` is zero.
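For example, a minimal sketch comparing two discrete distributions (the distributions here are arbitrary illustrations):

.. code-block:: python

    import jax.numpy as jn
    import objax

    # Two discrete probability distributions over the same support.
    p = jn.array([0.25, 0.25, 0.5])
    q = jn.array([0.1, 0.3, 0.6])

    # Computed per the formula above: p * log((p + eps) / (q + eps)).
    d = objax.functional.divergence.kl(p, q)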
.. currentmodule:: objax.functional.loss
.. autosummary::

    cross_entropy_logits
    cross_entropy_logits_sparse
    l2
    sigmoid_cross_entropy_logits
.. autofunction:: cross_entropy_logits
Calculates the cross entropy loss, defined as follows:

.. math::

    \begin{aligned}
    l(y,\hat{y}) &= - \sum_{j=1}^{q} y_j \log \frac{e^{o_j}}{\sum_{k=1}^{q} e^{o_k}} \\
                 &= \log \sum_{k=1}^{q} e^{o_k} - \sum_{j=1}^{q} y_j o_j
    \end{aligned}

where :math:`o_k` are the logits and :math:`y_k` are the labels.
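A minimal sketch tying the formula to the API: ``cross_entropy_logits`` takes one-hot labels, while ``cross_entropy_logits_sparse`` takes integer class indices for the same targets.

.. code-block:: python

    import jax.numpy as jn
    import objax

    logits = jn.array([[2.0, 0.5, -1.0],
                       [0.1, 1.2, 0.3]])

    # Dense form: labels are one-hot vectors.
    labels = jn.array([[1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0]])
    loss_dense = objax.functional.loss.cross_entropy_logits(logits, labels)

    # Sparse form: labels are integer class indices.
    loss_sparse = objax.functional.loss.cross_entropy_logits_sparse(
        logits, jn.array([0, 1]))

    # Both match the formula above: logsumexp(o) - sum_j y_j * o_j.
    manual = (objax.functional.logsumexp(logits, axis=1)
              - (labels * logits).sum(axis=1))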
.. autofunction:: cross_entropy_logits_sparse
.. autofunction:: l2
Calculates the l2 loss, as:

.. math::

    l_2 = \frac{\sum_{i} x_{i}^2}{2}
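For instance, per the formula above, the l2 loss of a vector is half its sum of squares:

.. code-block:: python

    import jax.numpy as jn
    import objax

    x = jn.array([1.0, 2.0, 3.0])
    loss = objax.functional.loss.l2(x)  # (1 + 4 + 9) / 2 = 7.0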
.. autofunction:: sigmoid_cross_entropy_logits
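For multi-label targets, where each output is an independent binary decision rather than one of mutually exclusive classes, a minimal sketch:

.. code-block:: python

    import jax.numpy as jn
    import objax

    logits = jn.array([[1.5, -0.3, 0.2]])
    labels = jn.array([[1.0, 0.0, 1.0]])  # independent binary targets per output

    loss = objax.functional.loss.sigmoid_cross_entropy_logits(logits, labels)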
.. currentmodule:: objax.functional.parallel
.. autosummary::

    pmax
    pmean
    pmin
    psum
.. autofunction:: pmax
.. autofunction:: pmean
.. autofunction:: pmin
.. autofunction:: psum
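These reductions are meant to be called inside a function that is replicated across devices, for example under ``objax.Parallel`` or ``jax.pmap``. A minimal sketch, assuming the reductions' default axis name matches the ``'device'`` axis name used by the surrounding ``pmap``:

.. code-block:: python

    import jax
    import jax.numpy as jn
    import objax

    n = jax.local_device_count()
    x = jn.arange(n, dtype=jn.float32)  # one scalar per device

    # Inside the replicated function, psum/pmean reduce across all devices,
    # so every device receives the same total/mean.
    totals = jax.pmap(lambda v: objax.functional.parallel.psum(v),
                      axis_name='device')(x)
    means = jax.pmap(lambda v: objax.functional.parallel.pmean(v),
                     axis_name='device')(x)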