Logits and log_probs set to 0 when using cross_entropy #8398

@datduonguva

Description

In MaskGAN's seq2seq_vd, when the training strategy is cross_entropy, all of the logits and log_probs outputs of the generator's decoder are set to 0. Why is this the case?

if FLAGS.gen_training_strategy != 'cross_entropy':
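A minimal sketch of the pattern the question is about (all names, shapes, and the `decode` function itself are illustrative, not the actual seq2seq_vd API): under teacher-forced cross-entropy training the sampled-decoding outputs would go unused, so returning zero tensors keeps the decoder's output signature consistent across both strategies.

```python
import numpy as np

def decode(targets, strategy, vocab_size=5):
    """Toy decoder illustrating the branch in question.

    This is a hypothetical sketch, not the seq2seq_vd implementation.
    """
    batch, seq_len = targets.shape
    if strategy != 'cross_entropy':
        # Free-running decoding (e.g. for the GAN/REINFORCE objective)
        # produces real per-step logits and log-probabilities.
        logits = np.random.randn(batch, seq_len, vocab_size)
        log_probs = logits - np.log(
            np.exp(logits).sum(axis=-1, keepdims=True))
    else:
        # Teacher-forced cross-entropy training: the sampled-decoding
        # outputs are never consumed, so zero placeholders preserve the
        # return signature without computing anything.
        logits = np.zeros((batch, seq_len, vocab_size))
        log_probs = np.zeros((batch, seq_len, vocab_size))
    return logits, log_probs
```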

