-- update --
Actually, for this other simple network, `tight = false` doesn't solve the problem. However, this is still definitely a bounds issue, since completely removing the layer bounds from Reluplex leads to the correct result.
It isn't clear whether the bound calculation is intrinsically wrong or whether its usage is incorrect; this still needs to be explored.
-- original issue --
The incorrect bounds affect other solvers that use them as constraints, such as Reluplex. This failure case is courtesy of Chris Strong:
```julia
nnet = Network([Layer(reshape([1.0; 1.0], (2, 1)), [0.5, 0.1], ReLU()),
                Layer([-4. -1.], [1.0], Id())])

# On the interval [-1, 1], nnet ranges over [-6.1, 1]. We test
# y ≤ 0.05 for all x ∈ [-1, 1], which we know is false.
input_set  = Hyperrectangle(low = [-1.0], high = [1.0])
output_set = HalfSpace([1.0], 0.05)
problem    = Problem(nnet, input_set, output_set)
```

```julia
julia> solve(Reluplex(), problem)
CounterExampleResult(:holds, Float64[])
```
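To confirm that `:holds` really is wrong, the network is small enough to check by brute force. A minimal sketch in Python with the forward pass hand-coded from the weights above, so the check is independent of NeuralVerification.jl:

```python
import numpy as np

def nnet(x):
    # Forward pass of the 1-2-1 example network, hand-coded from the issue.
    h = np.maximum(np.array([x + 0.5, x + 0.1]), 0.0)  # layer 1: W = [1; 1], ReLU
    return -4.0 * h[0] - 1.0 * h[1] + 1.0              # layer 2: W = [-4 -1], Id

xs = np.linspace(-1.0, 1.0, 10001)
ys = np.array([nnet(x) for x in xs])
print(ys.min(), ys.max())       # ≈ -6.1 and 1.0
print(bool((ys > 0.05).any()))  # True, so "y ≤ 0.05 for all x" is violated
```

Since the output reaches 1.0 (e.g. at x = -1), the property y ≤ 0.05 is clearly violated, so the correct verdict is `:violated`.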
This is obviously incorrect, and is due to the bounds being overly tight (leading to an infeasible optimization problem). Changing the `MaxSens` calculation to `tight = false` yields the correct result (i.e. `:violated`). See these sections for reference on where this is happening:
The `tight` version is used in `get_bounds` (NeuralVerification.jl/src/utils/util.jl, lines 271 to 273 at b99f427):

```julia
function get_bounds(nnet::Network, input::Hyperrectangle, act::Bool = true) # NOTE there is another function by the same name in convDual. Should reconsider dispatch
    if act
        solver = MaxSens(0.0, true)
```
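For context, the kind of layer bound at stake here can be reproduced with plain interval arithmetic, which over-approximates and is therefore sound by construction. A minimal sketch in Python (my own re-derivation for the example network above, not the package's `MaxSens` code):

```python
import numpy as np

def interval_affine(W, b, lo, hi):
    # Sound interval bounds for y = W @ x + b with x in [lo, hi]:
    # positive weights pick up the like-signed bound, negative weights the opposite one.
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

# Weights of the example network from the issue.
W1, b1 = np.array([[1.0], [1.0]]), np.array([0.5, 0.1])
W2, b2 = np.array([[-4.0, -1.0]]), np.array([1.0])

lo, hi = np.array([-1.0]), np.array([1.0])
lo, hi = interval_affine(W1, b1, lo, hi)           # pre-activations: [-0.5, 1.5], [-0.9, 1.1]
lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)  # ReLU
lo, hi = interval_affine(W2, b2, lo, hi)
print(lo, hi)  # lower ≈ -6.1, upper = 1.0
```

These loose bounds already contain the true range [-6.1, 1], so any valid layer bounds handed to Reluplex must be at least this wide; bounds that exclude part of this range would make the encoding infeasible, which is consistent with the false `:holds` above.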
For the return statement in question, see line 60 of NeuralVerification.jl/src/reachability/maxSens.jl (lines 53 to 64 at cab24c5).
@changliuliu, before I go any deeper, is there anything above that strikes you as incorrect?
tomerarnon changed the title from "The tight version of MaxSens leads to incorrect bounds" to "Applying layer-bounds in reluplex leads to false :holds" on Feb 6, 2020.