
Fix metrics examples, add average precision metric #522

Merged
5 commits merged into main on Dec 6, 2023

Conversation

@amrit110 (Member) commented Dec 5, 2023

PR Type

Feature, Documentation

Short Description

  • Add Average Precision (also known as AUPRC) support for the binary case to the metrics package
  • Fix the Examples sections in the docstrings of the evaluate package
  • Add doctest to the pre-commit hooks (and hence to the test suite) to verify that docstring examples run correctly
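For context, binary average precision summarizes the precision–recall curve as a weighted mean of precision at each score threshold, with the increase in recall between thresholds as the weight. A minimal NumPy sketch of that computation (this is not the cyclops implementation; the function name is illustrative, and tied scores get no special handling):

```python
import numpy as np


def binary_average_precision(target, preds):
    """Average precision (AUPRC) for binary targets.

    Computes sum_n (R_n - R_{n-1}) * P_n over descending score thresholds.
    """
    target = np.asarray(target)
    preds = np.asarray(preds)
    order = np.argsort(-preds)      # sort samples by descending score
    target = target[order]
    tp = np.cumsum(target)          # true positives at each threshold
    fp = np.cumsum(1 - target)      # false positives at each threshold
    precision = tp / (tp + fp)
    recall = tp / target.sum()
    # Prepend recall=0 so the first step contributes width R_1 - 0.
    return float(np.sum(np.diff(recall, prepend=0) * precision))
```

For example, `binary_average_precision([1, 0, 1, 0], [0.9, 0.8, 0.7, 0.6])` yields about 0.833, matching the standard step-wise definition used by `sklearn.metrics.average_precision_score` when there are no tied scores.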

Tests Added

  • tests/cyclops/evaluate/metrics/test_average_precision.py
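To illustrate what the new doctest hook checks: the code in a docstring's Examples section is executed, and its printed output is compared against the expected values shown below each `>>>` line. A toy, self-contained sketch of the pattern (this function is hypothetical, not the actual cyclops API):

```python
def accuracy(target, preds):
    """Fraction of exact matches between target and preds.

    Examples
    --------
    >>> accuracy([0, 1, 1, 0], [0, 1, 0, 0])
    0.75
    """
    correct = sum(t == p for t, p in zip(target, preds))
    return correct / len(target)


if __name__ == "__main__":
    import doctest

    doctest.testmod()  # fails if any docstring example's output differs
```

Running this module directly (or wiring `doctest` into pre-commit, as this PR does for cyclops) turns every docstring example into a small regression test.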

@amrit110 amrit110 added documentation Improvements or additions to documentation enhancement New feature or request labels Dec 5, 2023
@amrit110 amrit110 self-assigned this Dec 5, 2023
codecov bot commented Dec 5, 2023

Codecov Report

Merging #522 (de3cf07) into main (49b7fd8) will increase coverage by 0.06%.
The diff coverage is 69.64%.

Additional details and impacted files


@@            Coverage Diff             @@
##             main     #522      +/-   ##
==========================================
+ Coverage   64.80%   64.86%   +0.06%     
==========================================
  Files          93       95       +2     
  Lines        9231     9270      +39     
==========================================
+ Hits         5982     6013      +31     
- Misses       3249     3257       +8     
Files | Coverage Δ
cyclops/evaluate/metrics/accuracy.py | 97.91% <ø> (ø)
cyclops/evaluate/metrics/auroc.py | 92.15% <ø> (ø)
.../evaluate/metrics/experimental/confusion_matrix.py | 98.83% <100.00%> (ø)
.../metrics/experimental/distributed_backends/base.py | 88.46% <100.00%> (ø)
...etrics/experimental/distributed_backends/mpi4py.py | 36.73% <100.00%> (ø)
...etrics/experimental/functional/confusion_matrix.py | 89.47% <100.00%> (ø)
cyclops/evaluate/metrics/experimental/metric.py | 82.92% <100.00%> (ø)
cyclops/evaluate/metrics/experimental/utils/ops.py | 70.31% <100.00%> (ø)
...clops/evaluate/metrics/experimental/utils/types.py | 85.71% <ø> (ø)
.../evaluate/metrics/experimental/utils/validation.py | 84.44% <100.00%> (ø)
... and 19 more

@fcogidi (Collaborator) left a comment:

Looks good.

doctest is quite helpful.

@amrit110 amrit110 marked this pull request as ready for review December 6, 2023 14:27
@amrit110 amrit110 merged commit cfa1eff into main Dec 6, 2023
7 checks passed
@amrit110 amrit110 deleted the add_doctest_avg_precision branch December 6, 2023 14:48