
Add Specificity metric #210

Merged
merged 16 commits into from May 3, 2021

Conversation

arv-77
Contributor

@arv-77 arv-77 commented Apr 28, 2021

Before submitting

  • Was this discussed/approved via a Github issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

Fixes #106.
Added the class-based and functional implementations of specificity, along with tests.
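For context, specificity is the true-negative rate, TN / (TN + FP). A minimal plain-Python sketch of the binary case (a hypothetical illustration only, not the PR's tensor-based implementation, which also handles multiclass inputs and averaging modes):

```python
def binary_specificity(preds, target):
    """Specificity (true-negative rate) = TN / (TN + FP) for 0/1 labels."""
    tn = sum(1 for p, t in zip(preds, target) if p == 0 and t == 0)
    fp = sum(1 for p, t in zip(preds, target) if p == 1 and t == 0)
    return tn / (tn + fp)

preds = [0, 1, 0, 1]
target = [0, 1, 1, 0]
# Negatives in target are at positions 0 and 3; preds there are 0 (TN) and 1 (FP),
# so specificity is 1 / (1 + 1) = 0.5.
print(binary_specificity(preds, target))  # 0.5
```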

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@pep8speaks

pep8speaks commented Apr 28, 2021

Hello @arvindmuralie77! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-05-03 17:25:02 UTC

@Borda Borda added enhancement New feature or request New metric labels Apr 28, 2021
@maximsch2
Contributor

maximsch2 commented Apr 29, 2021

Can we make the testing simpler by flipping the labels and predictions and calculating regular recall (by calling sklearn functions)?
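The suggestion works because specificity is exactly the recall of the negative class. A small stdlib-only sketch of the equivalence (the reviewer's suggestion would use sklearn.metrics.recall_score on the flipped arrays in the same way; the helper below is a hypothetical stand-in):

```python
def recall(target, preds):
    # Ordinary recall = TP / (TP + FN), with 1 as the positive class.
    tp = sum(1 for t, p in zip(target, preds) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(target, preds) if t == 1 and p == 0)
    return tp / (tp + fn)

target = [0, 1, 1, 0, 1, 0]
preds  = [0, 1, 0, 1, 1, 0]

# Flip both labels and predictions, then compute ordinary recall:
flipped = recall([1 - t for t in target], [1 - p for p in preds])

# Direct specificity: TN / (TN + FP).
tn = sum(1 for t, p in zip(target, preds) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(target, preds) if t == 0 and p == 1)
assert flipped == tn / (tn + fp)  # both give 2/3
```

This makes test expectations easy to generate with well-tested reference code instead of hand-computed values.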

@SkafteNicki SkafteNicki changed the title Recall Add Specificity metric Apr 29, 2021
Member

@SkafteNicki SkafteNicki left a comment

Overall LGTM, some extra comments:

  • remember to add entry to the docs/source/references/modules.rst and docs/source/references/functional.rst
  • remember to add entry to CHANGELOG.md file

Review threads (outdated, resolved):

  • torchmetrics/classification/specificity.py
  • torchmetrics/functional/classification/specificity.py
@codecov

codecov bot commented Apr 29, 2021

Codecov Report

Merging #210 (a467b73) into master (e46a7d2) will decrease coverage by 0.05%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #210      +/-   ##
==========================================
- Coverage   96.68%   96.62%   -0.06%     
==========================================
  Files         180       92      -88     
  Lines        5786     2935    -2851     
==========================================
- Hits         5594     2836    -2758     
+ Misses        192       99      -93     
Flag Coverage Δ
Linux 79.76% <85.36%> (+0.07%) ⬆️
Windows 79.76% <85.36%> (+0.07%) ⬆️
cpu 96.62% <100.00%> (-0.06%) ⬇️
gpu ?
macOS 96.62% <100.00%> (-0.06%) ⬇️
pytest 96.62% <100.00%> (-0.06%) ⬇️
python3.6 ?
python3.8 96.59% <100.00%> (-0.06%) ⬇️
python3.9 96.59% <100.00%> (+0.04%) ⬆️
torch1.3.1 ?
torch1.4.0 ?
torch1.8.1 96.59% <100.00%> (+0.04%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
torchmetrics/__init__.py 100.00% <ø> (ø)
torchmetrics/classification/__init__.py 100.00% <100.00%> (ø)
torchmetrics/classification/specificity.py 100.00% <100.00%> (ø)
torchmetrics/functional/__init__.py 100.00% <100.00%> (ø)
torchmetrics/functional/classification/__init__.py 100.00% <100.00%> (ø)
...chmetrics/functional/classification/specificity.py 100.00% <100.00%> (ø)
torchmetrics/average.py 83.33% <0.00%> (-12.50%) ⬇️
__w/2/s/torchmetrics/utilities/__init__.py
...2/s/torchmetrics/retrieval/mean_reciprocal_rank.py
...unctional/classification/precision_recall_curve.py
... and 89 more

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update e46a7d2...a467b73.

Review thread on CHANGELOG.md (outdated, resolved)
@SkafteNicki
Member

Hi @arvindmuralie77,
The code is looking great. Can I request one more change? We are in the process of adding tests for differentiability, so it would be great if you could:

@mergify mergify bot removed the has conflicts label Apr 30, 2021
Member

@SkafteNicki SkafteNicki left a comment

LGTM; the only remaining change is to remove the unused arguments. I know that other metrics do this, but let's try not to.

Review threads (outdated, resolved):

  • torchmetrics/functional/classification/specificity.py
  • torchmetrics/classification/specificity.py
@SkafteNicki SkafteNicki added this to the v0.4 milestone May 3, 2021
Member

@Borda Borda left a comment

lgtm

@Borda Borda enabled auto-merge (squash) May 3, 2021 17:20
@Borda Borda merged commit df6e4ba into Lightning-AI:master May 3, 2021
@Borda Borda mentioned this pull request Jun 13, 2021

Successfully merging this pull request may close these issues: Add Specificity.

5 participants