
Made val & test loss like train loss in LogisticRegression #664

Merged
3 commits merged into Lightning-Universe:master on Jun 16, 2021

Conversation

@garryod (Contributor) commented Jun 16, 2021

What does this PR do?

Fixes the dissimilarity between the train and val/test losses introduced by #655.
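The diff itself is not shown on this page, but the usual way to keep train, val, and test losses consistent is to route all three step hooks through one shared loss routine rather than computing the loss differently per stage. A minimal pure-Python sketch of that pattern (the names `LogisticRegressionSketch`, `_shared_loss`, and `cross_entropy` are hypothetical illustrations, not code from this PR):

```python
import math

def cross_entropy(logits, target):
    """Softmax cross-entropy for a single example (hypothetical helper).

    Computed in a numerically stable way: log-sum-exp minus the logit
    of the target class.
    """
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_sum_exp - logits[target]

class LogisticRegressionSketch:
    """Illustrates the fix: train, val, and test all share one loss routine."""

    def _shared_loss(self, logits, target):
        # Single source of truth for the loss, so all three stages agree.
        return cross_entropy(logits, target)

    def training_step(self, logits, target):
        return self._shared_loss(logits, target)

    def validation_step(self, logits, target):
        return self._shared_loss(logits, target)

    def test_step(self, logits, target):
        return self._shared_loss(logits, target)
```

With this structure, any later change to the loss automatically applies to all three stages, which is exactly the kind of train-vs-val/test drift the PR description says it is fixing.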

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@github-actions github-actions bot added the model label Jun 16, 2021
@garryod garryod changed the title Made val & test loss like train loss Made val & test loss like train loss in LogisticRegression Jun 16, 2021
codecov bot commented Jun 16, 2021

Codecov Report

Merging #664 (380a378) into master (01269af) will decrease coverage by 53.50%.
The diff coverage is 36.52%.

❗ Current head 380a378 differs from the pull request's most recent head f418994. Consider uploading reports for the commit f418994 to get more accurate results.

@@             Coverage Diff             @@
##           master     #664       +/-   ##
===========================================
- Coverage   79.04%   25.54%   -53.51%     
===========================================
  Files         102      118       +16     
  Lines        5912     7110     +1198     
===========================================
- Hits         4673     1816     -2857     
- Misses       1239     5294     +4055     
Flag        Coverage Δ
cpu         25.54% <36.52%> (-0.24%) ⬇️
pytest      25.54% <36.52%> (-0.24%) ⬇️
unittests   ?

Flags with carried forward coverage won't be shown.

Impacted Files                                    Coverage Δ
pl_bolts/callbacks/byol_updates.py                45.45% <0.00%> (-54.55%) ⬇️
pl_bolts/callbacks/variational.py                 29.78% <0.00%> (-66.14%) ⬇️
pl_bolts/datasets/dummy_dataset.py                38.33% <0.00%> (-61.67%) ⬇️
pl_bolts/datasets/ssl_amdim_datasets.py           19.17% <0.00%> (-57.85%) ⬇️
pl_bolts/models/gans/basic/basic_gan_module.py    19.35% <0.00%> (-76.00%) ⬇️
pl_bolts/models/regression/linear_regression.py   23.68% <0.00%> (-74.83%) ⬇️
pl_bolts/models/rl/__init__.py                     0.00% <0.00%> (-88.89%) ⬇️
pl_bolts/models/rl/common/agents.py                0.00% <0.00%> (-100.00%) ⬇️
pl_bolts/models/rl/common/gym_wrappers.py          0.00% <0.00%> (-91.60%) ⬇️
pl_bolts/models/rl/double_dqn_model.py             0.00% <0.00%> (-95.66%) ⬇️
... and 164 more


Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update aad3d24...f418994.

@Borda Borda added the ready label Jun 16, 2021
@Borda Borda merged commit 40b6f78 into Lightning-Universe:master Jun 16, 2021
2 participants