
Query of theorem of handling residual networks with ADD layer #43

Closed
JacksonZyy opened this issue Nov 21, 2023 · 1 comment

Comments

@JacksonZyy

Dear developers,

I was very impressed reading your papers, in particular the theorems showing how the constrained problems can be solved with efficient algorithms that run on GPUs.
In the proofs, I notice that the original constrained problem includes constraints of the form x(i) = W(i) x(i-1) + b(i), which only captures the behavior of fully-connected/convolutional/... layers.
However, an Add layer in a residual network (as in an ONNX model) computes x(i) = x(i-1) + x(i-k). I fail to see how the theorem extends to residual networks, yet residual networks do appear in your experiments.
So I wonder: is there a theorem behind the handling of residual networks, and if so, is it just a customization of your existing theorem?
Thank you in advance for your clarification!
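
To make the comparison concrete, here is a sketch (my own notation, not taken from the papers) of how the Add constraint could be written in the same affine form, by stacking the two parent activations into one vector so that the "weight" becomes a block-identity matrix and the bias is zero:

```latex
% Sketch only: the Add node x^{(i)} = x^{(i-1)} + x^{(i-k)} viewed as an
% affine constraint over the stacked parent activations.
x^{(i)}
  = \begin{bmatrix} I & I \end{bmatrix}
    \begin{bmatrix} x^{(i-1)} \\ x^{(i-k)} \end{bmatrix}
  + 0
```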

@shizhouxing
Member

Hi @JacksonZyy, bound propagation for general computational graphs, including ResNet, is formulated in the auto_LiRPA paper: https://arxiv.org/abs/2002.12920
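
For readers who want to try this: below is a minimal sketch of bounding a toy residual block with auto_LiRPA, following the usage shown in its README (BoundedModule, BoundedTensor, PerturbationLpNorm, compute_bounds). The ResidualBlock module and all parameter values here are illustrative assumptions, not code from this repository.

```python
# Minimal sketch: bound a toy residual block with auto_LiRPA.
# The ResidualBlock module and the eps value are illustrative assumptions.
import torch
import torch.nn as nn
from auto_LiRPA import BoundedModule, BoundedTensor
from auto_LiRPA.perturbations import PerturbationLpNorm

class ResidualBlock(nn.Module):
    """f(x) = ReLU(Wx + b) + x, i.e. an ONNX-style Add of two earlier nodes."""
    def __init__(self, dim=16):
        super().__init__()
        self.fc = nn.Linear(dim, dim)

    def forward(self, x):
        # The Add node: x(i) = x(i-1) + x(i-k)
        return torch.relu(self.fc(x)) + x

net = ResidualBlock()
x = torch.randn(1, 16)

# Wrap the network; auto_LiRPA traces the full computational graph,
# so the Add node is handled like any other operator.
model = BoundedModule(net, torch.empty_like(x))

# L-infinity perturbation of radius eps around the input x.
ptb = PerturbationLpNorm(norm=float("inf"), eps=0.01)
bounded_x = BoundedTensor(x, ptb)

# Backward-mode bound propagation (CROWN) over the whole graph.
lb, ub = model.compute_bounds(x=(bounded_x,), method="CROWN")
print(lb, ub)
```

The key point is that BoundedModule operates on the traced graph rather than a fixed layer sequence, which is why residual connections need no special-case theorem in practice.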
