
Loss Weighting Overridden in IBP Training #2102

Closed
GiulioZizzo opened this issue Apr 16, 2023 · 0 comments · Fixed by #2112
@GiulioZizzo
Collaborator

Describe the bug
If the user chooses not to use a scheduler for the loss weighting and instead supplies a fixed weighting, their value is overridden and a hard-coded default is used instead.

To Reproduce
Steps to reproduce the behavior:

  1. Go to ibp_certified_trainer_pytorch.py
  2. Scroll down to line 304
  3. loss_weighting_k = 0.1 should be loss_weighting_k = self.loss_weighting
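The steps above can be sketched as a minimal stand-in (the class, method, and `use_scheduler` names here are illustrative, not ART's actual API; only `loss_weighting` and the `loss_weighting_k = 0.1` line come from the report):

```python
class IBPTrainerSketch:
    """Illustrative stand-in for the IBP certified trainer's loss weighting."""

    def __init__(self, loss_weighting: float = 0.1, use_scheduler: bool = True):
        self.loss_weighting = loss_weighting  # user-supplied fixed weighting
        self.use_scheduler = use_scheduler

    def get_loss_weighting(self, step: int, total_steps: int) -> float:
        if self.use_scheduler:
            # Scheduled case: ramp the weighting over training (simplified).
            return min(1.0, step / max(1, total_steps))
        # Buggy line overrode the user's fixed value with a default:
        #     loss_weighting_k = 0.1
        # Fixed: respect the user-supplied value.
        loss_weighting_k = self.loss_weighting
        return loss_weighting_k


# With the fix, a user-supplied fixed weighting is honoured:
trainer = IBPTrainerSketch(loss_weighting=0.5, use_scheduler=False)
print(trainer.get_loss_weighting(step=10, total_steps=100))  # 0.5
```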

System information (please complete the following information):

  • OS: macOS
  • Python version: 3.8
  • ART version or commit number: 1.14.0
  • PyTorch version: 1.13.1
@beat-buesser beat-buesser added the bug Something isn't working label Apr 18, 2023
@beat-buesser beat-buesser added this to the ART 1.14.1 milestone Apr 18, 2023
@GiulioZizzo GiulioZizzo mentioned this issue Apr 18, 2023
@beat-buesser beat-buesser linked a pull request Apr 18, 2023 that will close this issue
@beat-buesser beat-buesser added this to Issues in progress in ART 1.14.1 Apr 18, 2023
@beat-buesser beat-buesser moved this from Issues in progress to Issues closed in ART 1.14.1 Apr 20, 2023