
losses.rst


API - Losses

To keep TensorLayerX simple, we provide only a minimal set of cost functions. For more complex loss functions, use the backend API directly (TensorFlow, MindSpore, PaddlePaddle, or PyTorch).

Note

Please refer to Getting Started for how to obtain the specific weights to pass to the weight-regularization functions.

tensorlayerx.losses

softmax_cross_entropy_with_logits
sigmoid_cross_entropy
binary_cross_entropy
mean_squared_error
normalized_mean_square_error
absolute_difference_error
dice_coe
dice_hard_coe
iou_coe
cross_entropy_seq
cross_entropy_seq_with_mask
cosine_similarity
li_regularizer
lo_regularizer
maxnorm_regularizer
maxnorm_o_regularizer
maxnorm_i_regularizer

Softmax cross entropy

softmax_cross_entropy_with_logits
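To illustrate the math behind this loss, here is a minimal NumPy sketch. The actual `tensorlayerx.losses.softmax_cross_entropy_with_logits` operates on backend tensors and its exact signature and reduction options may differ; the function below is an assumption-labeled reference, not the library implementation.

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    """Mean softmax cross entropy over a batch (illustrative sketch).

    logits: float array of shape (batch, num_classes), unnormalized scores.
    labels: int array of shape (batch,), true class indices.
    """
    # Shift logits by their row-max for numerical stability.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log-probability of the true class for each sample.
    nll = -log_probs[np.arange(len(labels)), labels]
    return nll.mean()
```

Note that the loss takes raw logits, not softmax outputs: the log-softmax is computed internally in a numerically stable way.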

Sigmoid cross entropy

sigmoid_cross_entropy
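A NumPy sketch of the standard numerically stable formulation of this loss follows. It is a reference for the math only; the `tensorlayerx.losses.sigmoid_cross_entropy` signature and reduction behavior should be checked against the library itself.

```python
import numpy as np

def sigmoid_cross_entropy(logits, targets):
    """Mean element-wise sigmoid cross entropy (illustrative sketch).

    Numerically stable rewrite of
    -(t * log(sigmoid(x)) + (1 - t) * log(1 - sigmoid(x))).
    """
    x = np.asarray(logits, dtype=float)
    t = np.asarray(targets, dtype=float)
    # max(x, 0) - x*t + log(1 + exp(-|x|)) avoids overflow for large |x|.
    loss = np.maximum(x, 0) - x * t + np.log1p(np.exp(-np.abs(x)))
    return loss.mean()
```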

Binary cross entropy

binary_cross_entropy
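Unlike the sigmoid variant, binary cross entropy is typically applied to probabilities already in (0, 1). A hedged NumPy sketch of that formulation, with a small epsilon to keep the logarithm finite (the library's clipping constant may differ):

```python
import numpy as np

def binary_cross_entropy(output, target, eps=1e-8):
    """Mean binary cross entropy; `output` holds probabilities in (0, 1)."""
    # Clip to avoid log(0) at exactly 0 or 1.
    o = np.clip(np.asarray(output, dtype=float), eps, 1 - eps)
    t = np.asarray(target, dtype=float)
    return -(t * np.log(o) + (1 - t) * np.log(1 - o)).mean()
```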

Mean squared error (L2)

mean_squared_error
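The underlying computation is simply the mean of squared element-wise differences; a NumPy sketch (the library version additionally supports backend tensors and reduction options):

```python
import numpy as np

def mean_squared_error(output, target):
    """Mean of squared element-wise differences (L2 loss)."""
    diff = np.asarray(output, dtype=float) - np.asarray(target, dtype=float)
    return np.mean(diff ** 2)
```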

Normalized mean square error

normalized_mean_square_error
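One common definition of this loss normalizes the root squared error by the norm of the target, giving a scale-free relative error. The sketch below follows that reading; the exact normalization and batch reduction used by `tensorlayerx.losses.normalized_mean_square_error` should be confirmed from its docstring.

```python
import numpy as np

def normalized_mean_square_error(output, target):
    """Root squared error divided by the target's norm (relative error)."""
    o = np.asarray(output, dtype=float)
    t = np.asarray(target, dtype=float)
    return np.sqrt(((o - t) ** 2).sum()) / np.sqrt((t ** 2).sum())
```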

Absolute difference error (L1)

absolute_difference_error
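This is the L1 counterpart of the mean squared error; a minimal sketch:

```python
import numpy as np

def absolute_difference_error(output, target):
    """Mean of absolute element-wise differences (L1 loss)."""
    o = np.asarray(output, dtype=float)
    t = np.asarray(target, dtype=float)
    return np.abs(o - t).mean()
```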

Dice coefficient

dice_coe
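The soft Dice coefficient measures overlap between a prediction and a target, both with values in [0, 1]. A NumPy sketch of the usual formulation with a smoothing term (the library function also takes axis and loss-type parameters; the `smooth` default here is an assumption):

```python
import numpy as np

def dice_coe(output, target, smooth=1e-5):
    """Soft Dice coefficient, 2*|A∩B| / (|A| + |B|), in (0, 1].

    Returns 1 for perfect overlap. `smooth` keeps the ratio defined
    when both inputs are all zeros.
    """
    o = np.asarray(output, dtype=float)
    t = np.asarray(target, dtype=float)
    inter = (o * t).sum()
    return (2.0 * inter + smooth) / (o.sum() + t.sum() + smooth)
```

In training, this is typically used as `1 - dice_coe(...)` so that better overlap gives a lower loss.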

Hard Dice coefficient

dice_hard_coe

IOU coefficient

iou_coe
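IoU (Jaccard) compares intersection to union, usually after thresholding predictions to hard {0, 1} masks. A hedged NumPy sketch (threshold and smoothing defaults here are assumptions, not the library's):

```python
import numpy as np

def iou_coe(output, target, threshold=0.5, smooth=1e-5):
    """Intersection-over-union of two masks after thresholding to {0, 1}."""
    o = (np.asarray(output, dtype=float) > threshold).astype(float)
    t = (np.asarray(target, dtype=float) > threshold).astype(float)
    inter = (o * t).sum()
    union = o.sum() + t.sum() - inter
    return (inter + smooth) / (union + smooth)
```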

Cross entropy for sequence

cross_entropy_seq

Cross entropy with mask for sequence

cross_entropy_seq_with_mask
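The masked variant averages the per-step cross entropy only over real tokens, so padding does not dilute the loss. A NumPy sketch of that idea (shapes and signature here are illustrative assumptions; the library function works on backend tensors):

```python
import numpy as np

def cross_entropy_seq_with_mask(logits, targets, mask):
    """Sequence cross entropy that ignores padded positions.

    logits: (steps, num_classes) unnormalized scores.
    targets: (steps,) int class ids.
    mask: (steps,) with 1.0 for real tokens, 0.0 for padding.
    """
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    nll = -log_probs[np.arange(len(targets)), targets]
    mask = np.asarray(mask, dtype=float)
    # Average only over unmasked (real) time steps.
    return (nll * mask).sum() / mask.sum()
```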

Cosine similarity

cosine_similarity
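Cosine similarity is the dot product of two vectors divided by the product of their norms; a minimal sketch for flat vectors (the library version handles batched backend tensors):

```python
import numpy as np

def cosine_similarity(v1, v2):
    """Cosine of the angle between two flat vectors, in [-1, 1]."""
    v1 = np.asarray(v1, dtype=float)
    v2 = np.asarray(v2, dtype=float)
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))
```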

Regularization functions

For tf.nn.l2_loss, tf.contrib.layers.l1_regularizer, tf.contrib.layers.l2_regularizer and tf.contrib.layers.sum_regularizer, see the TensorFlow API.

Maxnorm

maxnorm_regularizer
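As a hedged sketch of the factory pattern these regularizers follow: `maxnorm_regularizer` is commonly read as returning a function that penalizes the largest absolute weight, scaled by a constant. Both that reading and the closure-returning shape below are assumptions to be checked against the TensorLayerX docstring.

```python
import numpy as np

def maxnorm_regularizer(scale=1.0):
    """Return a function computing scale * max(|w|) over a weight array.

    Illustrative sketch of the regularizer-factory pattern: you build
    the regularizer once, then apply it to each weight tensor.
    """
    def mn(weights):
        return scale * np.abs(np.asarray(weights, dtype=float)).max()
    return mn

# Usage: build once, apply to a layer's weights.
reg = maxnorm_regularizer(0.5)
```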

Special

li_regularizer

lo_regularizer

maxnorm_o_regularizer

maxnorm_i_regularizer