Bound backward error of BoundMaxPool layer #5
Hi @persistz,
@persistz @Melcfrn Apologies for my late response. This is a known limitation of our current implementation: so far we only support a single maxpooling layer, and a linear layer must be used after the maxpooling layer. If your architecture does not meet this requirement, you can probably make it work by disabling Patches mode; if your network is small, this might work. You can do this by changing the relevant setting. We are working on more general support for maxpooling layers, but this is not a top priority at this point, so progress can be a bit slow. I will let you know when it is ready. Thank you.
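For reference, disabling Patches mode is typically done through the verifier configuration. A hedged sketch, assuming the option is named `conv_mode` and lives under the `general` section (check the option name and location against your version of alpha-beta-CROWN):

```yaml
general:
  # Assumed option name: switches conv-layer bound propagation from the
  # memory-efficient Patches representation to the plain matrix mode,
  # which BoundMaxPool.bound_backward can accept.
  conv_mode: matrix
```

Matrix mode uses considerably more memory than Patches mode, which is why it is only likely to work for small networks.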
Hi @huanzhang12, thanks for the update. I will check it and close this issue soon.
I trained several adversarially trained models based on the original GitHub repos of Madry and TRADES, including with the definitions in `alpha-beta-CROWN/complete_verifier/model_defs.py`. When I use `robustness_verifier.py` to verify these models, the same error is raised for all of them. The error is caused by the line

`assert type(last_lA) == torch.Tensor or type(last_uA) == torch.Tensor`

in `auto_LiRPA/operators/convolution.py`, class `BoundMaxPool`, function `bound_backward`. When I checked the type of `last_lA`, I found it is a Patches object rather than a Tensor, so it cannot pass the assertion. I then checked the demos in `exp_configs`, intending to use a model and config provided by you to reproduce the error, but unfortunately it seems that none of the models contain a MaxPool layer. If you need me to provide a model for reproducing this problem, I would be happy to do so.
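For anyone else hitting this assertion, the failure mode can be illustrated without auto_LiRPA installed. Below is a minimal sketch using hypothetical stub classes in place of `torch.Tensor` and auto_LiRPA's `Patches` (the stubs are assumptions made for illustration, not the real classes):

```python
# Stub stand-ins, used only to illustrate why the strict type check in
# BoundMaxPool.bound_backward rejects Patches-typed coefficients.
class Tensor:   # hypothetical stand-in for torch.Tensor
    pass

class Patches:  # hypothetical stand-in for auto_LiRPA's Patches
    pass

def bound_backward_accepts(last_lA, last_uA):
    # Mirrors the check in the issue:
    # assert type(last_lA) == torch.Tensor or type(last_uA) == torch.Tensor
    return type(last_lA) == Tensor or type(last_uA) == Tensor

# Tensor coefficients pass the check...
print(bound_backward_accepts(Tensor(), Tensor()))    # True
# ...but coefficients produced in Patches mode (as happens when the
# preceding conv layers propagate bounds as patches) fail it, which
# triggers the AssertionError reported above.
print(bound_backward_accepts(Patches(), Patches()))  # False
```

This is why switching the convolution bound propagation to matrix mode (so that `last_lA` arrives as a plain tensor) can sidestep the assertion for small networks.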