
accuracy metric is no longer supported #1111

Open
AmitMY opened this issue May 22, 2024 · 0 comments
AmitMY commented May 22, 2024

When training a translation model with VQ outputs, perplexity is not a great fit for dev-set evaluation, and BLEU and similar metrics are not great either.
What I really want to use is accuracy, which reports the accuracy over all predicted factors; even accuracy over just the main token would be great.

In constants, a metric named accuracy is declared and set up, but it is never reported anywhere, so selecting it as the early-stopping metric fails:

AssertionError: Early stopping metric accuracy not found in validation metrics.

I would love to see accuracy supported again, and not removed.
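For reference, here is a minimal sketch of the kind of metric I mean. This is not Sockeye's actual implementation; the function name and the padding id are assumptions for illustration only:

```python
import numpy as np

PAD_ID = 0  # assumed padding token id

def token_accuracy(predictions: np.ndarray, references: np.ndarray) -> float:
    """Fraction of non-padding reference tokens that were predicted exactly.

    Both arrays hold token ids with shape (batch, seq_len); padding
    positions in the references are excluded from the count.
    """
    mask = references != PAD_ID
    correct = (predictions == references) & mask
    return float(correct.sum()) / float(mask.sum())

# Example: two sentences padded to length 4.
refs = np.array([[5, 7, 9, 0],
                 [3, 3, 0, 0]])
preds = np.array([[5, 8, 9, 0],
                  [3, 3, 0, 0]])
print(token_accuracy(preds, refs))  # 4 of 5 non-pad tokens match -> 0.8
```

Something like this computed over the dev set, and extended to all output factors, is what I would hope to be able to select as the early-stopping metric.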
