
Add InfoLM #915

Merged

merged 55 commits into master from metric/InfoLM on Jul 12, 2022

Conversation

stancld
Contributor

@stancld stancld commented Mar 25, 2022

What does this PR do?

Fixes #849

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

I've prepared a personal repository with scripts/instructions for reproducing the test results -> it will also be useful for #852 and #854 if the original package installation is not available.

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@stancld stancld added this to the v0.8 milestone Mar 25, 2022
@stancld stancld added this to In progress in Text via automation Mar 25, 2022
@stancld stancld added enhancement New feature or request New metric labels Mar 25, 2022
torchmetrics/functional/text/infolm.py: 3 resolved review threads (outdated)
@Borda Borda modified the milestones: v0.8, v0.9 Apr 5, 2022
@codecov

codecov bot commented Apr 9, 2022

Codecov Report

Merging #915 (5a8f6f0) into master (c7b7c46) will decrease coverage by 0%.
The diff coverage is 94%.

@@          Coverage Diff           @@
##           master   #915    +/-   ##
======================================
- Coverage      94%    94%    -0%     
======================================
  Files         182    185     +3     
  Lines        8147   8389   +242     
======================================
+ Hits         7654   7880   +226     
- Misses        493    509    +16     

@mergify mergify bot removed the has conflicts label Jun 30, 2022
@stancld
Contributor Author

stancld commented Jul 7, 2022

@justusschock @SkafteNicki Thanks a lot for your reviews, I'll resolve the remaining issues tomorrow.

@Borda Not sure how to fix the issue with those daemonic processes having children -- gonna probe this in more depth tomorrow as well :]

Member

@SkafteNicki SkafteNicki left a comment


Hi @stancld,
I did a bit of digging, and it seems the problem is using multiple workers in the dataloader, as this tries to spawn additional processes within each DDP process (the DDP processes are themselves spawned for testing), which is not allowed. It seems to work for BERTScore, so I am not sure why it is not working for this metric.

src/torchmetrics/functional/text/infolm.py: 2 resolved review threads (outdated)
src/torchmetrics/text/infolm.py: 1 resolved review thread (outdated)
* Skip tests if transformers are not available
* Mark tests as skipped if there's again any connection issue to HF Hub
@stancld
Contributor Author

stancld commented Jul 8, 2022

@Borda Mind having a look at some of your previous comments to check whether they're now settled? :]

@stancld
Contributor Author

stancld commented Jul 8, 2022

Hi @stancld, I did a bit of digging, and it seems the problem is using multiple workers in the dataloader, as this tries to spawn additional processes within each DDP process (the DDP processes are themselves spawned for testing), which is not allowed. It seems to work for BERTScore, so I am not sure why it is not working for this metric.

I think the problem arises when dist_sync_on_step=True, and I suspect this configuration is not tested for BERTScore, as it uses a non-standard test suite as suggested in #1110 -> gonna fix bertscore testing once this is merged :]
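The multiprocessing restriction behind the "daemonic processes having children" error can be reproduced with the standard library alone. Below is a minimal, TorchMetrics-free sketch (the helper names are illustrative, not from the PR): a process started with daemon=True, as the spawned DDP test processes apparently are, cannot start() a child of its own, which is exactly what a DataLoader with num_workers > 0 attempts to do.

```python
import multiprocessing as mp


def _attempt_child(queue):
    """Runs inside a daemonic process and tries to start a child of its own."""
    try:
        mp.Process(target=int).start()  # analogous to starting a dataloader worker
        queue.put("child started")
    except Exception as exc:  # CPython currently raises AssertionError here
        queue.put(str(exc))


def run():
    """Start a daemonic process and report what happens when it spawns a child."""
    queue = mp.Queue()
    daemon = mp.Process(target=_attempt_child, args=(queue,), daemon=True)
    daemon.start()
    message = queue.get()
    daemon.join()
    return message


if __name__ == "__main__":
    print(run())  # daemonic processes are not allowed to have children
```

Setting num_workers=0 in the test dataloaders (so no grandchild processes are created) is one way around the restriction.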

@Borda
Member

Borda commented Jul 11, 2022

@stancld good with me 🎉 how about the very last comments?
cc: @justusschock @SkafteNicki 🐰

@mergify mergify bot added the ready label Jul 11, 2022
@Borda Borda enabled auto-merge (squash) July 11, 2022 21:01
Member

@SkafteNicki SkafteNicki left a comment


LGTM

src/torchmetrics/functional/__init__.py: 1 resolved review thread (outdated)
Text automation moved this from In progress to Reviewer approved Jul 12, 2022
@Borda
Member

Borda commented Jul 12, 2022

@stancld mind checking the remaining comments and marking them as resolved if that's the case... then it'll be merged 🎉

@Borda Borda merged commit 4ebb4a2 into master Jul 12, 2022
Text automation moved this from Reviewer approved to Done Jul 12, 2022
@Borda Borda deleted the metric/InfoLM branch July 12, 2022 14:46
Development

Successfully merging this pull request may close these issues.

Add InfoLM
4 participants