
Bootstrapping scores for an uncertainty estimate #95

Open

KennethEnevoldsen opened this issue Jan 25, 2024 · 0 comments
Labels: enhancement (New feature or request), not planned (We are not planning to solve this issue at the moment)

Comments

@KennethEnevoldsen (Owner)

The current implementation of the evaluators only produces a single score, which makes it hard to gauge the uncertainty of that estimate.

A potential solution is to bootstrap at the document level and recalculate the score for each resample.
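A minimal sketch of that idea, assuming the evaluator can be wrapped as a `score_fn` that maps a list of documents to a single score (`bootstrap_score`, `score_fn`, and the document list are hypothetical names for illustration, not part of the current codebase):

```python
import numpy as np


def bootstrap_score(documents, score_fn, n_resamples=1000, seed=42):
    """Bootstrap a corpus-level score by resampling documents with replacement.

    documents: list of per-document items the evaluator consumes (hypothetical shape).
    score_fn: callable mapping a list of documents to a single score.
    Returns the point estimate and a 95% percentile confidence interval.
    """
    rng = np.random.default_rng(seed)
    n = len(documents)
    point_estimate = score_fn(documents)
    resampled_scores = []
    for _ in range(n_resamples):
        # Sample n document indices with replacement and rescore the resample.
        idx = rng.integers(0, n, size=n)
        resampled_scores.append(score_fn([documents[i] for i in idx]))
    lower, upper = np.percentile(resampled_scores, [2.5, 97.5])
    return point_estimate, (lower, upper)


# Example usage: here the "score" is just the mean of per-document accuracies.
docs = [0.8, 0.6, 0.9, 0.7, 0.85, 0.75]
est, (lo, hi) = bootstrap_score(docs, lambda d: sum(d) / len(d))
print(f"score = {est:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Since resampling happens at the document level, any within-document structure is kept intact, and the interval reflects sensitivity to which documents ended up in the evaluation set.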

@KennethEnevoldsen added the enhancement (New feature or request), not planned (We are not planning to solve this issue at the moment), and no-stale (For issues that should not go stale) labels on Jan 25, 2024
@KennethEnevoldsen removed the no-stale (For issues that should not go stale) label on Feb 17, 2024
Projects: None yet
Development: No branches or pull requests

1 participant