This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

changed lm metric reporting #765

Closed

Conversation

shreydesai
Contributor

Summary:
Changed language model metric reporting to aggregate perplexity information directly. The previous calculation aggregated logits at each evaluation step, which could produce very large tensors given large vocabulary sizes.

A future diff will merge this metric reporter with the BERT metric reporter, which performs a similar calculation without real-time metric reporting.

Differential Revision: D16189934
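A minimal sketch of the aggregation strategy described above, assuming the standard formulation of perplexity as the exponential of the mean per-token negative log-likelihood. The `PerplexityAggregator` class and its method names are illustrative, not PyText's actual API: the point is that each evaluation step contributes only two scalars (summed loss and token count), rather than a batch × sequence × vocabulary logits tensor.

```python
import math

class PerplexityAggregator:
    """Accumulate summed cross-entropy loss and token counts per batch,
    then derive perplexity once at the end, instead of retaining the
    per-step logits tensors (illustrative sketch, not PyText code)."""

    def __init__(self):
        self.total_loss = 0.0    # sum of per-token negative log-likelihoods
        self.total_tokens = 0

    def add_batch(self, summed_nll, num_tokens):
        # summed_nll: cross-entropy loss summed (not averaged) over tokens,
        # e.g. what a loss with reduction='sum' would return for the batch
        self.total_loss += summed_nll
        self.total_tokens += num_tokens

    def perplexity(self):
        # exp of the mean per-token negative log-likelihood
        return math.exp(self.total_loss / self.total_tokens)

agg = PerplexityAggregator()
agg.add_batch(summed_nll=69.3, num_tokens=30)
agg.add_batch(summed_nll=46.2, num_tokens=20)
print(agg.perplexity())
```

Because addition of loss sums and token counts is associative, the final perplexity is identical to computing it over the full evaluation set in one pass, while the memory cost per step drops to O(1).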

@facebook-github-bot facebook-github-bot added the CLA Signed Do not delete this pull request or issue due to inactivity. label Jul 10, 2019
Summary:
Pull Request resolved: facebookresearch#765


fbshipit-source-id: fc92762b1abaa3b98d8bcecd9335de2098251d20
@facebook-github-bot
Contributor

This pull request has been merged in c6c952e.
