
IBP Loss Weighting #2112

Merged

Conversation

GiulioZizzo
Collaborator

Description

We fix an issue where, in certain situations, the default value of a loss weighting parameter was used instead of the user-supplied value. We also add input validation to the IBP trainer.

Fixes #2102
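The bug class described above is a common one: a parameter's default silently wins over the user-supplied value on some code path. The sketch below (hypothetical names, not ART's actual API) illustrates the fixed pattern, resolving the user value once and using it everywhere, plus the kind of input check the PR describes:

```python
DEFAULT_LOSS_WEIGHTING = 0.1  # hypothetical default, for illustration only


class IBPTrainerSketch:
    """Toy stand-in for an IBP certified trainer (not ART's real class)."""

    def __init__(self, loss_weighting=None):
        # Buggy pattern being fixed: a later code path re-reads the
        # module-level default (weight = DEFAULT_LOSS_WEIGHTING) instead
        # of the value the user passed in. The fix: resolve it once here.
        if loss_weighting is None:
            loss_weighting = DEFAULT_LOSS_WEIGHTING
        # Input validation of the kind the PR adds: reject bad values
        # before training starts rather than failing mid-run.
        if not isinstance(loss_weighting, (int, float)):
            raise ValueError("loss_weighting must be a number")
        if not 0.0 <= loss_weighting <= 1.0:
            raise ValueError("loss_weighting must be in [0, 1]")
        self.loss_weighting = float(loss_weighting)

    def combined_loss(self, certified_loss, standard_loss):
        # Fixed behaviour: always use the stored, user-supplied weighting.
        w = self.loss_weighting
        return w * certified_loss + (1.0 - w) * standard_loss


trainer = IBPTrainerSketch(loss_weighting=0.7)
print(trainer.combined_loss(1.0, 0.0))  # uses 0.7, not the default 0.1
```

The weighting and bounds here are illustrative; the actual parameter name, default, and valid range are defined in ART's `ibp_certified_trainer_pytorch.py`.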

Type of change

Please check all relevant options.

  • Improvement (non-breaking)
  • Bug fix (non-breaking)
  • New feature (non-breaking)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Testing

Existing tests pass

Test Configuration:

  • OS: macOS
  • Python version: 3.8
  • ART version or commit number: 1.14
  • PyTorch version: 1.13.1

Checklist

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

Signed-off-by: GiulioZizzo <giulio.zizzo@yahoo.co.uk>
@codecov-commenter

codecov-commenter commented Apr 18, 2023

Codecov Report

Merging #2112 (bbeae7d) into dev_1.14.1 (6a8d734) will decrease coverage by 8.33%.
The diff coverage is 0.00%.


Impacted file tree graph

@@              Coverage Diff               @@
##           dev_1.14.1    #2112      +/-   ##
==============================================
- Coverage       85.64%   77.32%   -8.33%     
==============================================
  Files             297      297              
  Lines           26506    26512       +6     
  Branches         4861     4864       +3     
==============================================
- Hits            22701    20500    -2201     
- Misses           2568     4908    +2340     
+ Partials         1237     1104     -133     
Impacted Files                                          Coverage Δ
.../defences/trainer/ibp_certified_trainer_pytorch.py   75.00% <0.00%> (-3.17%) ⬇️

... and 28 files with indirect coverage changes

@beat-buesser beat-buesser self-requested a review April 18, 2023 20:45
@beat-buesser beat-buesser self-assigned this Apr 18, 2023
@beat-buesser beat-buesser added bug Something isn't working improvement Improve implementation labels Apr 18, 2023
@beat-buesser beat-buesser added this to the ART 1.14.1 milestone Apr 18, 2023
@beat-buesser beat-buesser linked an issue Apr 18, 2023 that may be closed by this pull request
@beat-buesser beat-buesser added this to Pull request open in ART 1.14.1 Apr 18, 2023
@beat-buesser beat-buesser moved this from Pull request open to Pull request review in ART 1.14.1 Apr 18, 2023
Collaborator

@beat-buesser beat-buesser left a comment

Hi @GiulioZizzo, thank you very much, looks good to me.

@beat-buesser beat-buesser merged commit 63f3501 into Trusted-AI:dev_1.14.1 Apr 20, 2023
ART 1.14.1 automation moved this from Pull request review to Pull request done Apr 20, 2023

Successfully merging this pull request may close these issues.

Loss Weighting Overridden in IBP Training