
Double log_softmax on logits in rnnt_loss_tf function #153

@thanatl

Description

In the rnnt_loss_tf function, tf.nn.log_softmax(logits) is applied on line 238, and then tf.nn.log_softmax is applied to the result again inside compute_rnnt_loss_and_grad_helper on line 159.

Is this intentional, or is it an error?

Reference file:
https://github.com/TensorSpeech/TensorFlowASR/blob/main/tensorflow_asr/losses/rnnt_losses.py#L150
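For reference, here is a minimal standalone TF2 sketch of the pattern (the tensor values are illustrative, not taken from the library): since exp(log_softmax(x)) sums to 1, a second log_softmax subtracts logsumexp = log(1) = 0, so the double application appears to return the same values and amounts to redundant compute rather than a change in the loss value.

```python
import tensorflow as tf

# Illustrative logits, not taken from the library.
logits = tf.constant([[2.0, 1.0, 0.5]])

once = tf.nn.log_softmax(logits)   # as in rnnt_loss_tf (line 238)
twice = tf.nn.log_softmax(once)    # as in compute_rnnt_loss_and_grad_helper (line 159)

# exp(once) sums to 1, so the second log_softmax subtracts log(1) = 0:
# the values are unchanged, only extra work is done.
tf.debugging.assert_near(once, twice)
print(once.numpy())
print(twice.numpy())
```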
