@ivanmkc
Perplexity ranges between zero and inf because the exponent can be negative (the sum of the negative log-likelihoods).
Check out the following blog post for a better understanding: Perplexity of fixed-length models.
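For reference, the definition in that post is (roughly) the following, for a tokenized sequence $X = (x_1, \ldots, x_t)$:

$$\mathrm{PPL}(X) = \exp\left(-\frac{1}{t}\sum_{i=1}^{t} \log p_\theta(x_i \mid x_{<i})\right)$$

The exponent is the average negative log-likelihood per token, which is what the README formula below computes.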
From evaluate/metrics/perplexity/README.md, line 60 in 8dfe057:

perplexity = e**(sum(losses) / num_tokenized_tokens)
If sum(losses) = 0, then perplexity = 1.
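As a quick sanity check of that boundary case, here is a minimal sketch of the formula from the README line above (the `perplexity` helper and the sample loss values are just illustrative, not part of the metric's API):

```python
import math

def perplexity(losses, num_tokenized_tokens):
    """Perplexity as written in the metric card: e ** (mean per-token loss).

    `losses` are per-token negative log-likelihoods (cross-entropy terms),
    so the exponent is their sum divided by the number of tokenized tokens.
    """
    return math.e ** (sum(losses) / num_tokenized_tokens)

# Boundary case discussed above: sum(losses) = 0 gives a perplexity of exactly 1.
print(perplexity([0.0, 0.0, 0.0], 3))  # 1.0

# A more typical case: non-zero per-token losses give a perplexity above 1.
print(perplexity([2.1, 1.7, 3.0], 3))  # ~9.65
```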