
docs: add docstrings for EvaluationRunResult #7885

Merged: 1 commit into main, Jun 19, 2024
Conversation

@masci (Contributor) commented on Jun 18, 2024:

Related Issues

Proposed Changes:

  • Add docstrings for EvaluationRunResult
  • Move the abstract class to a dedicated module (a sketch of the resulting layout follows this list)
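
For illustration only, a minimal sketch of the layout this change describes, with hypothetical method names and simplified signatures rather than the actual Haystack code: the abstract class sits in a dedicated base module, and every public method carries a docstring so no lint suppression is needed.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


# base.py: the abstract interface, now in its own module (illustrative sketch).
class BaseEvaluationRunResult(ABC):
    """Base class for the results of an evaluation run."""

    @abstractmethod
    def score_report(self) -> Dict[str, float]:
        """Return an aggregated score for each metric in the run."""


# eval_run_result.py: the concrete, documented implementation (illustrative sketch).
class EvaluationRunResult(BaseEvaluationRunResult):
    """Holds the per-metric results of a single evaluation run."""

    def __init__(self, results: Dict[str, List[float]]) -> None:
        """Store the raw per-metric scores produced by the run."""
        self._results = results

    def score_report(self) -> Dict[str, float]:
        """Return the mean score for each metric in the run."""
        return {metric: sum(scores) / len(scores) for metric, scores in self._results.items()}
```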

How did you test it?

Notes for the reviewer

Checklist

@masci added the labels type:documentation (Improvements on the docs) and ignore-for-release-notes (PRs with this flag won't be included in the release notes) on Jun 18, 2024
@masci requested a review from a team as a code owner on June 18, 2024 at 10:29
@masci requested a review from vblagoje and removed the request for a team on June 18, 2024 at 10:29
@coveralls (Collaborator) commented on Jun 18, 2024:

Pull Request Test Coverage Report for Build 9563422604

Details

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • 5 unchanged lines in 1 file lost coverage.
  • Overall coverage increased (+0.005%) to 89.751%

Files with coverage reduction:
  • evaluation/eval_run_result.py: 5 new missed lines, 91.53% covered

Totals (change from base build 9561171587: +0.005%):
  • Covered lines: 6927
  • Relevant lines: 7718

💛 - Coveralls

@@ -3,7 +3,8 @@ loaders:
     search_path: [../../../haystack/evaluation]
     modules:
       [
-        "eval_run_result"
+        "base",
A contributor commented on the diff above:
nice 👍🏽

In my opinion, this is the right approach to organizing abstractions; we could consider applying it to every bit of code that has abstractions.

@masci (Author) replied on Jun 19, 2024:
Slightly unrelated, but we should refrain from introducing abstractions, like in this case. I would have just gone with EvaluationRunResult and added an abstraction later if and when needed, possibly with a protocol, not an abstract class. In that case, we wouldn't have had this PR :)
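
For illustration, a hedged sketch of the alternative mentioned here, with hypothetical names: a typing.Protocol expresses the same contract structurally, so callers can annotate against it without the library committing to an abstract base class.

```python
from typing import Dict, Protocol


class SupportsScoreReport(Protocol):
    """Structural contract: any object with a matching score_report() qualifies."""

    def score_report(self) -> Dict[str, float]:
        ...


def print_scores(result: SupportsScoreReport) -> None:
    # Accepts EvaluationRunResult or any other object with the same method;
    # no inheritance from a shared abstract base class is required.
    for metric, score in result.score_report().items():
        print(f"{metric}: {score:.3f}")
```

The trade-off: a Protocol is checked structurally at type-check time only, while an abstract class fails loudly at instantiation if a method is missing.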

A contributor replied:
I agree that the abstractions here were too much, and we can always add them later if needed.

@davidsbatista (Contributor) commented:
@masci: thanks for removing the noqa: D102 👍🏽; there are probably other places where we need to do the same.
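
For context, D102 is the pydocstyle/Ruff rule for a missing docstring in a public method; once the method has a real docstring, the per-line suppression becomes unnecessary. A minimal before/after sketch with a hypothetical method name:

```python
class Before:
    """The undocumented method needs a per-line suppression to pass linting."""

    def score_report(self):  # noqa: D102
        ...


class After:
    """With a real docstring, the noqa directive can simply be removed."""

    def score_report(self):
        """Return an aggregated score for each metric in the run."""
        ...
```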

@davidsbatista (Contributor) left a review:

LGTM - thanks @masci

@masci merged commit 7c31d5f into main on Jun 19, 2024; 24 checks passed
@masci deleted the massi/7829 branch on June 19, 2024 at 09:49
Labels: ignore-for-release-notes (PRs with this flag won't be included in the release notes), type:documentation (Improvements on the docs)

Linked issue that merging this pull request may close: EvaluationRunResult.score_result() is not documented in API reference

3 participants