2 problems found in doc Benders Decomposition #3531

Closed · WalterMadelim opened this issue Oct 2, 2023 · 0 comments · Fixed by #3532

WalterMadelim commented Oct 2, 2023
In the Callback method section:

  1. On line 4 of the first code block, shouldn't the objective be θ + c_1' * x rather than just θ? I don't understand the reason for not adding the c_1' * x term.
  2. On the fourth line from the bottom of the second code block, inside function my_callback(cb_data), shouldn't model be changed to lazy_model? Is model meant to be the one from the iterative method? I don't understand the motivation.
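
For reference, here is a minimal sketch of the two proposed changes (assuming the variable names lazy_model, x, θ, c_1, and cb_data from the tutorial's callback code; the full corrected listing is further below):

# Change 1: include the first-stage cost in the master objective,
# instead of the tutorial's @objective(lazy_model, Min, θ).
@objective(lazy_model, Min, θ + c_1' * x)

# Change 2: submit the lazy cut against the callback's own model,
# instead of the tutorial's MOI.submit(model, ...).
MOI.submit(lazy_model, MOI.LazyConstraint(cb_data), cut)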

After making these two changes, I get a reasonable answer:

julia> x_optimal = value.(x)
2-element Vector{Float64}:
 -0.0
  1.0
julia> y_optimal = optimal_ret.y
2-element Vector{Float64}:
 0.0
 0.0

This is from the complete source code below; please try it.

using JuMP
import Gurobi
import Printf

function solve_subproblem(x)
    model = Model(Gurobi.Optimizer)
    @variable(model, y[1:dim_y] >= 0)
    con = @constraint(model, A_2 * y .<= b - A_1 * x)
    @objective(model, Min, c_2' * y)
    optimize!(model)
    @assert termination_status(model) == OPTIMAL
    return (obj = objective_value(model), y = value.(y), π = dual.(con))
end

function print_iteration(k, args...)
    f(x) = Printf.@sprintf("%12.4e", x)
    println(lpad(k, 9), " ", join(f.(args), " "))
    return
end

c_1 = [1, 4]                    # first-stage costs
c_2 = [2, 3]                    # second-stage costs
dim_x = length(c_1)
dim_y = length(c_2)
b = [-2; -3]
A_1 = [1 -3; -1 -3]
A_2 = [1 -2; -1 -1]
M = -1000;                      # lower bound for θ
MAXIMUM_ITERATIONS = 100        # not used in the callback variant
ABSOLUTE_OPTIMALITY_GAP = 1e-6

lazy_model = Model(Gurobi.Optimizer)
@variable(lazy_model, x[1:dim_x] >= 0, Int)
@variable(lazy_model, θ >= M)
@objective(lazy_model, Min, θ + c_1' * x) # problem here
print(lazy_model)

k = 0

"""
    my_callback(cb_data)

A callback that implements Benders decomposition. Note how similar it is to the
inner loop of the iterative method.
"""
function my_callback(cb_data)
    global k += 1
    x_k = callback_value.(cb_data, x)
    θ_k = callback_value(cb_data, θ)
    lower_bound = c_1' * x_k + θ_k
    ret = solve_subproblem(x_k)
    upper_bound = c_1' * x_k + c_2' * ret.y
    gap = (upper_bound - lower_bound) / upper_bound
    print_iteration(k, lower_bound, upper_bound, gap)
    if gap < ABSOLUTE_OPTIMALITY_GAP
        println("Terminating with the optimal solution")
        return
    end
    cut = @build_constraint(θ >= ret.obj + -ret.π' * A_1 * (x .- x_k))
    MOI.submit(lazy_model, MOI.LazyConstraint(cb_data), cut)  # problem here
    return
end

set_attribute(lazy_model, MOI.LazyConstraintCallback(), my_callback)

optimize!(lazy_model)

x_optimal = value.(x)

optimal_ret = solve_subproblem(x_optimal)
y_optimal = optimal_ret.y
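
As a quick sanity check (a sketch, assuming the corrected script above has already been run), the total cost of the recovered solution can be computed the same way as upper_bound inside my_callback:

total_cost = c_1' * x_optimal + c_2' * y_optimal   # mirrors upper_bound in my_callback
println("total cost = ", total_cost)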

If change 1 is not made, the master objective is just θ, so the first-stage cost c_1' * x is never penalized and we get:

julia> x_optimal = value.(x)
2-element Vector{Float64}:
 2.0e9
 2.0e9