Fix Perplexity docs (#2391)
Fix indentation
AtomicVar committed Feb 17, 2024
1 parent 4c999b8 commit 7e2ebff
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions src/torchmetrics/text/perplexity.py
@@ -33,8 +33,8 @@ class Perplexity(Metric):
     As input to ``forward`` and ``update`` the metric accepts the following input:

     - ``preds`` (:class:`~torch.Tensor`): Logits or a unnormalized score assigned to each token in a sequence with shape
-        [batch_size, seq_len, vocab_size], which is the output of a language model. Scores will be normalized internally
-        using softmax.
+      [batch_size, seq_len, vocab_size], which is the output of a language model. Scores will be normalized internally
+      using softmax.
     - ``target`` (:class:`~torch.Tensor`): Ground truth values with a shape [batch_size, seq_len]

     As output of ``forward`` and ``compute`` the metric returns the following output:
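The docstring being fixed describes logits of shape ``[batch_size, seq_len, vocab_size]`` that are softmax-normalized internally, and targets of shape ``[batch_size, seq_len]``. As a rough illustration of that contract (a minimal sketch, not TorchMetrics' actual implementation; the ``perplexity`` helper name is purely illustrative), perplexity is the exponential of the mean negative log-likelihood of the target tokens:

```python
import torch
import torch.nn.functional as F

def perplexity(preds: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Illustrative perplexity from unnormalized scores.

    preds:  [batch_size, seq_len, vocab_size] logits from a language model
    target: [batch_size, seq_len] ground-truth token indices
    """
    # Normalize the scores internally via (log-)softmax, as the docstring states.
    log_probs = F.log_softmax(preds, dim=-1)
    # Pick out the log-probability of each ground-truth token.
    nll = -log_probs.gather(-1, target.unsqueeze(-1)).squeeze(-1)
    # Perplexity = exp of the mean negative log-likelihood over all tokens.
    return torch.exp(nll.mean())

# Uniform logits over a vocabulary of 10 give perplexity exactly 10.
preds = torch.zeros(2, 4, 10)
target = torch.randint(10, (2, 4))
print(perplexity(preds, target))
```

In practice you would use ``torchmetrics.text.Perplexity`` itself, feeding ``preds`` and ``target`` with these shapes to ``update``/``forward`` and reading the result from ``compute``.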
