objax.functional
Due to the large number of APIs in this section, we organized it into the following sub-sections:
celu, elu, leaky_relu, log_sigmoid, log_softmax, logsumexp, relu, selu, sigmoid, softmax, softplus, tanh
celu
elu
leaky_relu
log_sigmoid
log_softmax
logsumexp
relu
selu
sigmoid
softmax
softplus
tanh
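Several of these activations are simple transformations of one another. The NumPy sketch below (illustrative only, not Objax's implementation) shows the relationship between `logsumexp`, `softmax`, and `log_softmax`:

```python
import numpy as np

def logsumexp(x):
    # Numerically stable log(sum(exp(x))): shift by the max first.
    m = x.max()
    return m + np.log(np.sum(np.exp(x - m)))

def softmax(x):
    # Softmax expressed via the stable logsumexp above.
    return np.exp(x - logsumexp(x))

def log_softmax(x):
    # log_softmax(x) == x - logsumexp(x)
    return x - logsumexp(x)

x = np.array([1.0, 2.0, 3.0])
print(softmax(x).sum())                            # softmax sums to 1
print(np.allclose(np.log(softmax(x)), log_softmax(x)))  # True
```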
average_pool_2d, batch_to_space2d, channel_to_space2d, max_pool_2d, space_to_batch2d, space_to_channel2d
average_pool_2d
For a definition of pooling, including examples, see Pooling Layer.
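As a minimal illustration of average pooling (a NumPy sketch over a single 2-D array, whereas Objax's version operates on batched NCHW tensors):

```python
import numpy as np

def average_pool_2d(x, size=2):
    # Non-overlapping average pooling on an (H, W) array:
    # split into size x size blocks and take each block's mean.
    h, w = x.shape
    x = x.reshape(h // size, size, w // size, size)
    return x.mean(axis=(1, 3))

x = np.arange(16, dtype=np.float32).reshape(4, 4)
print(average_pool_2d(x))  # block means: [[2.5, 4.5], [10.5, 12.5]]
```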
batch_to_space2d
channel_to_space2d
max_pool_2d
For a definition of pooling, including examples, see Pooling Layer.
space_to_batch2d
space_to_channel2d
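The channel/space transforms trade channel depth for spatial resolution. The sketch below illustrates the shape transformation of a depth-to-space style operation; the exact element ordering Objax uses may differ, so treat this as a shape-level sketch only:

```python
import numpy as np

def channel_to_space2d(x, size=2):
    # Shape-level sketch: (N, C*size*size, H, W) -> (N, C, H*size, W*size).
    # The element layout here is one common convention and may not
    # match Objax's exactly.
    n, c, h, w = x.shape
    c //= size * size
    x = x.reshape(n, size, size, c, h, w)
    x = x.transpose(0, 3, 4, 1, 5, 2)   # N, C, H, size, W, size
    return x.reshape(n, c, h * size, w * size)

x = np.arange(2 * 8 * 3 * 3).reshape(2, 8, 3, 3)
print(channel_to_space2d(x).shape)  # (2, 2, 6, 6)
```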
dynamic_slice, flatten, interpolate, one_hot, pad, scan, stop_gradient, top_k, rsqrt, upsample_2d, upscale_nn
dynamic_slice
flatten
interpolate
one_hot
pad
scan
stop_gradient
top_k
rsqrt
upsample_2d
upscale_nn
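Two of the simpler utilities here, `one_hot` and `flatten`, can be emulated in plain NumPy to show what they compute (an illustrative sketch, not Objax's implementation):

```python
import numpy as np

def one_hot(labels, num_classes):
    # One-hot encoding by indexing rows of an identity matrix.
    return np.eye(num_classes)[labels]

def flatten(x):
    # Keep the leading (batch) dimension, collapse the rest.
    return x.reshape(x.shape[0], -1)

print(one_hot(np.array([0, 2, 1]), 3))
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]
print(flatten(np.zeros((4, 3, 2))).shape)  # (4, 6)
```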
objax.functional.divergence
kl
kl
The ϵ term is added to ensure that neither p nor q is zero.
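The role of the ϵ term can be seen in the following NumPy sketch (the ϵ value and its exact placement here are illustrative; see the signature above for the definition Objax uses):

```python
import numpy as np

def kl(p, q, eps=1e-6):
    # KL(p || q) = sum_i p_i * log(p_i / q_i), with eps added so the
    # logarithm never sees an exact zero in either distribution.
    return np.sum(p * np.log((p + eps) / (q + eps)))

p = np.array([0.5, 0.5, 0.0])
q = np.array([0.25, 0.25, 0.5])
print(kl(p, q))   # positive, since p != q
print(kl(p, p))   # 0.0, a distribution has no divergence from itself
```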
objax.functional.loss
cross_entropy_logits, cross_entropy_logits_sparse, l2, mean_absolute_error, mean_squared_error, mean_squared_log_error, sigmoid_cross_entropy_logits
cross_entropy_logits
Calculates the cross entropy loss, defined as follows:
$$\begin{aligned} \begin{eqnarray} l(y,\hat{y}) & = & - \sum_{j=1}^{q} y_j \log \frac{e^{o_j}}{\sum_{k=1}^{q} e^{o_k}} \nonumber \\ & = & \log \sum_{k=1}^{q} e^{o_k} - \sum_{j=1}^{q} y_j o_j \nonumber \end{eqnarray} \end{aligned}$$ where $o_k$ are the logits, $y_k$ are the labels, and the second equality uses $\sum_{j=1}^{q} y_j = 1$.
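The two forms of this equation agree, which can be checked numerically. The sketch below is illustrative NumPy, not Objax's implementation:

```python
import numpy as np

def cross_entropy_logits(logits, labels):
    # Second form of the equation: log-sum-exp of the logits
    # minus the label-weighted logits, with a max-shift for stability.
    m = logits.max()
    lse = m + np.log(np.sum(np.exp(logits - m)))
    return lse - np.sum(labels * logits)

o = np.array([2.0, 1.0, 0.1])    # logits o_k
y = np.array([1.0, 0.0, 0.0])    # one-hot labels y_k

# First form: -sum_j y_j * log(softmax(o)_j)
softmax = np.exp(o) / np.sum(np.exp(o))
direct = -np.sum(y * np.log(softmax))

print(np.allclose(cross_entropy_logits(o, y), direct))  # True
```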
cross_entropy_logits_sparse
l2
Calculates the l2 loss.
mean_absolute_error
mean_squared_error
mean_squared_log_error
sigmoid_cross_entropy_logits
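A common numerically stable formulation of the sigmoid cross entropy is max(x, 0) − x·z + log(1 + e^(−|x|)); the sketch below uses that formulation for illustration, and Objax's exact implementation may differ:

```python
import numpy as np

def sigmoid_cross_entropy_logits(x, z):
    # Stable rewrite of -z*log(sigmoid(x)) - (1-z)*log(1 - sigmoid(x)),
    # avoiding overflow in exp() for large |x|.
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

x = np.array([-2.0, 0.0, 3.0])   # logits
z = np.array([0.0, 1.0, 1.0])    # binary targets

# Naive computation for comparison (fine at these small magnitudes).
s = 1 / (1 + np.exp(-x))
naive = -z * np.log(s) - (1 - z) * np.log(1 - s)
print(np.allclose(sigmoid_cross_entropy_logits(x, z), naive))  # True
```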
objax.functional.parallel
pmax, pmean, pmin, psum
pmax
pmean
pmin
psum
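These collectives reduce a value across all replicas and broadcast the result back to each replica. Their semantics can be emulated in NumPy by treating one array axis as the replica axis (an illustrative sketch; the real functions run inside parallelized computations across devices):

```python
import numpy as np

# Each row of `vals` stands for one replica's local value; the
# collective reduces across the replica axis (axis 0) and every
# replica receives the same reduced result.
vals = np.array([[1.0, 2.0],
                 [3.0, 4.0]])      # 2 replicas, one vector each

psum  = vals.sum(axis=0)           # [4., 6.]
pmean = vals.mean(axis=0)          # [2., 3.]
pmax  = vals.max(axis=0)           # [3., 4.]
pmin  = vals.min(axis=0)           # [1., 2.]
print(psum, pmean, pmax, pmin)
```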