Avoid model recreation at every call of reluplex_step #93
Conversation
Thanks for putting together this PR @castrong!
In addition to Mykel's comment about debug code, I think there is even greater opportunity for simplification once this major change is made. For example, I think the `relu_status` variable becomes obsolete in `reluplex_step` after this change, and can therefore be removed. As a result, the loop on `1:2` can become a loop over `for repair in (type_one_repair, type_two_repair)`. I think the entire function even goes away! This really simplifies the algorithm's look.
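A minimal sketch of the suggested loop refactor (the repair-function names come from the discussion; their signatures, the `relu` argument, and the surrounding variable names are assumptions, not the package's actual API):

```julia
# Sketch only: `relu` and the repair-function signatures are illustrative.
# Iterating over the functions directly replaces the loop over 1:2 plus a
# branch on relu_status.
for repair in (type_one_repair, type_two_repair)
    new_constraints = repair(model, relu)  # add constraints fixing the ReLU phase
    result = reluplex_step(model)          # recurse on the modified model
    delete.(model, new_constraints)        # undo the temporary constraints
end
```

Looping over the functions themselves means the loop body no longer needs to consult `relu_status` to decide which repair to apply.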
@tomerarnon I was considering keeping `relu_status` around for debugging purposes, but it would also be cleaner without. I'll try to implement some of the changes you suggested. I'm not sure I get what you mean by the entire function going away, though - could you elaborate on that a bit more? Thanks!
It was a typo! I meant the entire
By the way, it seems the tests are failing currently because
Ok, I tried changing that and refactored the for loop to be cleaner as suggested :) - I'll see if it breaks any tests now
Looks like that fixed it! One final stylistic suggestion before merging: how about, instead of

```julia
new_constraints = repair!(...)
...
delete.(model, new_constraints)
```
Sounds great - just pushed that. Also, should I be worried about the coverage/coveralls check?
Awesome. We already know it works, but I'll wait for the tests out of paranoia anyway.
Nope. <1% in either direction isn't meaningful anyway.
Currently, the model is recreated whenever a ReLU is fixed. This PR changes it so that (1) the desired constraints are added to the model to fix the ReLU to be active or inactive, (2) the recursive call is made, and then (3) the constraints are removed from the model. This is now possible because of added functionality in the JuMP optimization library.
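In JuMP terms, the add/recurse/delete pattern described above might look like the following sketch. The helper `fix_relu_active!`, the variables `x` and `y`, and the `reluplex_step` signature are hypothetical; only `@constraint` and `delete` are actual JuMP API:

```julia
using JuMP

# Hypothetical helper: pin the ReLU y = max(0, x) to its active phase by
# adding constraints to the existing model instead of rebuilding it.
function fix_relu_active!(model::Model, x, y)
    c1 = @constraint(model, y == x)   # active phase: output equals input
    c2 = @constraint(model, x >= 0)   # active phase requires nonnegative input
    return [c1, c2]
end

# (1) add the phase-fixing constraints,
# (2) make the recursive call on the same model,
# (3) delete the constraints, restoring the model for the next branch.
new_constraints = fix_relu_active!(model, x, y)
result = reluplex_step(model)         # recursive call (signature assumed)
delete.(model, new_constraints)       # JuMP's delete, broadcast over the vector
```

Because `delete` removes the constraints from the live model, each branch of the search sees a clean model without paying the cost of reconstructing it from scratch.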