Bootstrapping scores for an uncertainty estimate #95
Labels
enhancement
New feature or request
not planned
We are not planning to solve this issue at the moment
The current implementation of the evaluators only returns a single score, which makes it hard to see the uncertainty in that score.
A potential solution is to bootstrap at the document level and recalculate the scores on each resample.
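A minimal sketch of what document-level bootstrapping could look like (the function name and API are hypothetical, and it assumes the evaluator can expose per-document scores rather than just the aggregate):

```python
import numpy as np


def bootstrap_score(doc_scores, n_resamples=1000, ci=0.95, seed=0):
    """Estimate uncertainty of an aggregate score by resampling documents.

    doc_scores: per-document scores produced by an evaluator.
    Returns the point estimate and a (lower, upper) percentile interval.
    """
    rng = np.random.default_rng(seed)
    doc_scores = np.asarray(doc_scores, dtype=float)
    n = len(doc_scores)

    # Resample documents with replacement and recompute the aggregate
    # score for each bootstrap replicate.
    means = np.array([
        rng.choice(doc_scores, size=n, replace=True).mean()
        for _ in range(n_resamples)
    ])

    lower, upper = np.quantile(means, [(1 - ci) / 2, (1 + ci) / 2])
    return doc_scores.mean(), (lower, upper)
```

Reporting the interval alongside the point estimate would make it clear when two evaluator scores are within noise of each other.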