minor renaming of parameters in sampled_softmax_cross_entropy
w4nderlust committed Jul 26, 2020
1 parent bc1446c commit bffadfb
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions ludwig/models/modules/loss_modules.py
@@ -296,7 +296,7 @@ def cross_entropy_sequence_loss(logits, targets, sequence_length):
 
 def sampled_softmax_cross_entropy(
         labels,
-        logits,
+        last_hidden,
         num_classes=1,
         decoder_weights=None,
         decoder_biases=None,
@@ -318,7 +318,7 @@ def sampled_softmax_cross_entropy(
             weights=tf.transpose(decoder_weights),
             biases=decoder_biases,
             labels=labels,
-            inputs=logits,
+            inputs=last_hidden,
             num_sampled=negative_samples,
             num_classes=num_classes,
             sampled_values=sampled_values)
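Note on the rename: the tensor passed to this argument is the decoder's last hidden representation, not logits. tf.nn.sampled_softmax_loss builds the (sampled) logits internally from its weights and biases arguments, so calling the parameter logits was misleading; last_hidden describes what is actually fed in. Below is a minimal, self-contained sketch of the same call; the shapes and tensors are hypothetical and not taken from the Ludwig code.

import tensorflow as tf

# Hypothetical sizes for illustration only; Ludwig derives its actual values
# from the model configuration.
batch_size, hidden_size, num_classes, negative_samples = 32, 64, 1000, 25

# The decoder's last hidden layer activations (what the renamed parameter refers to).
last_hidden = tf.random.normal([batch_size, hidden_size])
# One true class id per example; sampled_softmax_loss expects shape [batch_size, num_true].
labels = tf.random.uniform([batch_size, 1], maxval=num_classes, dtype=tf.int64)
# Decoder projection stored as [hidden_size, num_classes], hence the transpose below.
decoder_weights = tf.random.normal([hidden_size, num_classes])
decoder_biases = tf.zeros([num_classes])

# sampled_softmax_loss computes logits internally for the true class plus
# `negative_samples` sampled classes, so it takes hidden activations, not logits.
loss_per_example = tf.nn.sampled_softmax_loss(
    weights=tf.transpose(decoder_weights),  # expected shape: [num_classes, hidden_size]
    biases=decoder_biases,
    labels=labels,
    inputs=last_hidden,
    num_sampled=negative_samples,
    num_classes=num_classes)  # sampled_values omitted: the default candidate sampler is used

loss = tf.reduce_mean(loss_per_example)

The sampled loss is only a training-time approximation of the full softmax cross entropy; at evaluation time the full softmax should be computed over all classes.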

0 comments on commit bffadfb
