diff --git a/docs/src/solvers.md b/docs/src/solvers.md
index 67d1040c360..ad4edeeae0a 100644
--- a/docs/src/solvers.md
+++ b/docs/src/solvers.md
@@ -1,7 +1,89 @@ Interacting with solvers ========================
-TODO: Describe the connection between JuMP and solvers. Automatic vs. Manual
-mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
+A JuMP model keeps a [MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
+*backend* of type `MOI.ModelLike` internally that stores the optimization
+problem and acts as the optimization solver. We call it an MOI *backend* and
+not an optimizer because it can also be a wrapper around an optimization file
+format such as MPS that writes the JuMP model to a file. JuMP can be viewed as
+a lightweight, user-friendly layer on top of the MOI backend:
+
+* JuMP does not maintain any copy of the model outside this MOI backend.
+* JuMP variable (resp. constraint) references are simple structures containing
+  both a reference to the JuMP model and the MOI index of the variable (resp.
+  constraint).
+* JuMP gives the constraints to the MOI backend in the form provided by the
+  user without doing any automatic reformulation.
+* Variable additions, constraint additions/modifications and objective
+  modifications are applied directly to the MOI backend, so the backend is
+  expected to support such modifications.
+
+While this allows JuMP to be a thin wrapper on top of the solver API, the last
+point above seems rather demanding on the solver. Indeed, while some solvers
+support incremental building of the model and modifications before and after
+solve, other solvers only support the model being copied at once before solve.
+Moreover, it seems to require each solver to implement every possible
+reformulation independently, which is both very ambitious and likely to
+generate a lot of duplicated code.
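+
+For instance, a JuMP variable reference as described in the bullets above can
+be pictured as the following lightweight structure (an illustrative sketch,
+not JuMP's exact definition):
+```julia
+struct VariableRef
+    m::Model                  # the JuMP model the variable belongs to
+    index::MOI.VariableIndex  # index of the variable in the MOI backend
+end
+```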
+
+These apparent limitations are in fact addressed at the MOI level in a manner
+that is completely transparent to JuMP. While the MOI API may seem very
+demanding, it allows MOI models to be a succession of lightweight MOI layers
+that fill the gap between JuMP's requirements and the solver's capabilities.
+
+JuMP models can be created in three different modes: Automatic, Manual and
+Direct.
+
+## Automatic and Manual modes
+
+In Automatic and Manual modes, two MOI layers are automatically applied to the
+optimizer:
+
+* `CachingOptimizer`: maintains a cache of the model so that when the optimizer
+  does not support an incremental change to the model, the optimizer's internal
+  model can be discarded and restored from the cache just before optimization.
+  The `CachingOptimizer` has two different modes: Automatic and Manual,
+  corresponding to the two JuMP modes with the same names.
+* `LazyBridgeOptimizer` (this can be disabled using the `bridge_constraints`
+  keyword argument to the [`Model`](@ref) constructor): when an added
+  constraint is not supported by the optimizer, it tries to transform the
+  constraint into an equivalent form, possibly adding new variables and
+  constraints that are supported by the optimizer. The applied transformations
+  are selected among known recipes called bridges. A few default bridges are
+  defined in MOI but new ones can be defined and added to the
+  `LazyBridgeOptimizer` used by JuMP.
+
+See the [MOI documentation](http://www.juliaopt.org/MathOptInterface.jl/stable/)
+for more details on these two MOI layers.
+
+To create a new JuMP model, JuMP needs to create a new empty optimizer
+instance.
New optimizer instances can be obtained using an
+[`OptimizerFactory`](@ref) that can be created using the
+[`with_optimizer`](@ref) function:
+```@docs
+with_optimizer
+```
+
+The factory can be passed to the [`JuMP.optimize`](@ref)
+function:
+```@docs
+JuMP.optimize
+```
+
+New JuMP models are created using the [`Model`](@ref) constructor:
+```@docs
+Model()
+Model(::JuMP.OptimizerFactory)
+```
+
+## Direct mode
+
+JuMP models can be created in Direct mode using the [`JuMP.direct_model`](@ref)
+function.
+```@docs
+JuMP.direct_model
+```
+
+TODO: How to set parameters (solver
 specific and generic). Status codes. Accessing the result. How to accurately measure the solve time.
diff --git a/src/JuMP.jl b/src/JuMP.jl
index ec2012ade13..e1dc65578d4 100644
--- a/src/JuMP.jl
+++ b/src/JuMP.jl
@@ -29,6 +29,7 @@ using .Derivatives export # Objects Model, VariableRef, Norm, AffExpr, QuadExpr,
+    with_optimizer,
 # LinearConstraint, QuadConstraint, SDConstraint, NonlinearConstraint, ConstraintRef,
@@ -77,6 +78,60 @@ const MOIBIN = MOICON{MOI.SingleVariable,MOI.ZeroOne} @MOIU.model JuMPMOIModel (ZeroOne, Integer) (EqualTo, GreaterThan, LessThan, Interval) (Zeros, Nonnegatives, Nonpositives, SecondOrderCone, RotatedSecondOrderCone, GeometricMeanCone, PositiveSemidefiniteConeTriangle, PositiveSemidefiniteConeSquare, RootDetConeTriangle, RootDetConeSquare, LogDetConeTriangle, LogDetConeSquare) () (SingleVariable,) (ScalarAffineFunction,ScalarQuadraticFunction) (VectorOfVariables,) (VectorAffineFunction,)
+"""
+    OptimizerFactory
+
+User-friendly closure that creates new MOI models. New `OptimizerFactory`s are
+created with [`with_optimizer`](@ref) and new models are created from the
+factory with [`create_model`](@ref).
+
+## Examples
+
+The following constructs a factory and then uses it to create two independent
+`IpoptOptimizer`s:
+```julia
+factory = with_optimizer(IpoptOptimizer, print_level=0)
+optimizer1 = JuMP.create_model(factory)
+optimizer2 = JuMP.create_model(factory)
+```
+"""
+struct OptimizerFactory
+    # The constructor can be
+    # * `Function`: a function, or
+    # * `DataType`: a type, or
+    # * `UnionAll`: a type with missing parameters.
+    constructor::Union{Function, DataType, UnionAll}
+    args::Tuple
+    kwargs # type changes from Julia v0.6 to v0.7 so we leave it untyped for now
+end
+
+"""
+    with_optimizer(constructor::Type, args...; kwargs...)
+
+Return a factory that creates optimizers using the constructor `constructor`
+with positional arguments `args` and keyword arguments `kwargs`.
+
+## Examples
+
+The following returns a factory that creates `IpoptOptimizer`s using the
+constructor call `IpoptOptimizer(print_level=0)`:
+```julia
+with_optimizer(IpoptOptimizer, print_level=0)
+```
+"""
+function with_optimizer(constructor::Type, args...; kwargs...)
+    return OptimizerFactory(constructor, args, kwargs)
+end
+
+"""
+    create_model(factory::OptimizerFactory)
+
+Create a new model with the factory `factory`.
+"""
+function create_model(factory::OptimizerFactory)
+    return factory.constructor(factory.args...; factory.kwargs...)
+end
+
 ############################################################################### # Model
@@ -123,50 +178,102 @@ mutable struct Model <: AbstractModel # Enable extensions to attach arbitrary information to a JuMP model by # using an extension-specific symbol as a key. ext::Dict{Symbol, Any}
+end
-    # Default constructor.
-    function Model(;
-                   mode::ModelMode=Automatic,
-                   backend=nothing,
-                   optimizer=nothing,
-                   bridge_constraints=true)
-        model = new()
-        model.variabletolowerbound = Dict{MOIVAR, MOILB}()
-        model.variabletoupperbound = Dict{MOIVAR, MOIUB}()
-        model.variabletofix = Dict{MOIVAR, MOIFIX}()
-        model.variabletointegrality = Dict{MOIVAR, MOIINT}()
-        model.variabletozeroone = Dict{MOIVAR, MOIBIN}()
-        model.customnames = VariableRef[]
-        if backend != nothing
-            # TODO: It would make more sense to not force users to specify
-            # Direct mode if they also provide a backend.
-            @assert mode == Direct
-            @assert optimizer === nothing
-            @assert MOI.isempty(backend)
-            model.moibackend = backend
-        else
-            @assert mode != Direct
-            universal_fallback = MOIU.UniversalFallback(JuMPMOIModel{Float64}())
-            caching_mode = (mode == Automatic) ? MOIU.Automatic : MOIU.Manual
-            caching_opt = MOIU.CachingOptimizer(universal_fallback,
-                                                caching_mode)
-            if bridge_constraints
-                model.moibackend = MOI.Bridges.fullbridgeoptimizer(caching_opt,
-                                                                   Float64)
-            else
-                model.moibackend = caching_opt
-            end
-            if optimizer !== nothing
-                MOIU.resetoptimizer!(model, optimizer)
-            end
-        end
-        model.optimizehook = nothing
-        model.nlpdata = nothing
-        model.objdict = Dict{Symbol, Any}()
-        model.operator_counter = 0
-        model.ext = Dict{Symbol, Any}()
-        return model
+"""
+    Model(moibackend::MOI.ModelLike)
+
+Return a new JuMP model with MOI backend `moibackend`. This is a low-level
+constructor used by [`Model()`](@ref), [`Model(::OptimizerFactory)`](@ref) and
+[`direct_model`](@ref).
+""" +function Model(moibackend::MOI.ModelLike) + @assert MOI.isempty(moibackend) + return Model(Dict{MOIVAR, MOILB}(), + Dict{MOIVAR, MOIUB}(), + Dict{MOIVAR, MOIFIX}(), + Dict{MOIVAR, MOIINT}(), + Dict{MOIVAR, MOIBIN}(), + VariableRef[], + moibackend, + nothing, + nothing, + Dict{Symbol, Any}(), + 0, + Dict{Symbol, Any}()) +end + +""" + Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic, + bridge_constraints::Bool=true) + +Return a new JuMP model without any optimizer; the model is stored the model in +a cache. The mode of the `CachingOptimizer` storing this cache is +`caching_mode`. The optimizer can be set later with [`set_optimizer`](@ref). If +`bridge_constraints` is true, constraints that are not supported by the +optimizer are automatically bridged to equivalent supported constraints when +an appropriate is defined in the `MathOptInterface.Bridges` module or is +defined in another module and is explicitely added. +""" +function Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic, + bridge_constraints::Bool=true) + universal_fallback = MOIU.UniversalFallback(JuMPMOIModel{Float64}()) + caching_opt = MOIU.CachingOptimizer(universal_fallback, + caching_mode) + if bridge_constraints + backend = MOI.Bridges.fullbridgeoptimizer(caching_opt, + Float64) + else + backend = caching_opt end + return Model(backend) +end + +""" + Model(factory::OptimizerFactory; + caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic, + bridge_constraints::Bool=true) + +Return a new JuMP model using the factory `factory` to create the optimizer. +This is equivalent to calling `Model` with the same keyword arguments and then +calling [`set_optimizer`](@ref) on the created model with the `factory`. The +factory can be created by the [`with_optimizer`](@ref) function. 
+
+## Examples
+
+The following creates a model using the optimizer
+`IpoptOptimizer(print_level=0)`:
+```julia
+model = JuMP.Model(with_optimizer(IpoptOptimizer, print_level=0))
+```
+"""
+function Model(factory::OptimizerFactory; kwargs...)
+    model = Model(; kwargs...)
+    optimizer = create_model(factory)
+    MOIU.resetoptimizer!(model, optimizer)
+    return model
+end
+
+"""
+    direct_model(backend::MOI.ModelLike)
+
+Return a new JuMP model using `backend` to store the model and solve it. As
+opposed to the [`Model`](@ref) constructor, no cache of the model is stored
+outside of `backend` and no bridges are automatically applied to `backend`.
+The absence of a cache reduces the memory footprint, but it is important to
+bear in mind the following implications of creating models using this *direct*
+mode:
+
+* When `backend` does not support an operation such as adding
+  variables/constraints after a solve or modifying constraints, an error is
+  thrown. With models created using the [`Model`](@ref) constructor, such
+  situations can be dealt with by storing the modifications in a cache and
+  loading them into the optimizer when `JuMP.optimize` is called.
+* No constraint bridging is supported by default.
+* The optimizer used cannot be changed after the model is constructed.
+* The model created cannot be copied.
+"""
+function direct_model(backend::MOI.ModelLike)
+    return Model(backend)
 end # In Automatic and Manual mode, `model.moibackend` is either directly the
diff --git a/src/optimizerinterface.jl b/src/optimizerinterface.jl
index c7474b3d145..9c316c69e2a 100644
--- a/src/optimizerinterface.jl
+++ b/src/optimizerinterface.jl
@@ -30,8 +30,17 @@ function MOIU.attachoptimizer!(model::Model) end
-function optimize(model::Model;
-                  ignore_optimize_hook=(model.optimizehook===nothing))
+"""
+    function optimize(model::Model,
+                      factory::Union{Nothing, OptimizerFactory} = nothing;
+                      ignore_optimize_hook=(model.optimizehook===nothing))
+
+Optimize the model.
If `factory` is not `nothing`, it first sets the optimizer
+to a new one created using the factory.
+"""
+function optimize(model::Model,
+                  factory::Union{Nothing, OptimizerFactory} = nothing;
+                  ignore_optimize_hook=(model.optimizehook===nothing))
     # The NLPData is not kept in sync, so re-set it here. # TODO: Consider how to handle incremental solves. if model.nlpdata !== nothing
@@ -39,6 +48,12 @@ function optimize(model::Model; empty!(model.nlpdata.nlconstr_duals) end
+    if factory !== nothing
+        optimizer = create_model(factory)
+        MOIU.resetoptimizer!(model, optimizer)
+        MOIU.attachoptimizer!(model)
+    end
+
     # If the user or an extension has provided an optimize hook, call # that instead of solving the model ourselves if !ignore_optimize_hook
diff --git a/test/generate_and_solve.jl b/test/generate_and_solve.jl
index 8440ac85813..0c9a535aa43 100644
--- a/test/generate_and_solve.jl
+++ b/test/generate_and_solve.jl
@@ -39,22 +39,19 @@ MOIU.loadfromstring!(model, modelstring) MOIU.test_models_equal(JuMP.caching_optimizer(m).model_cache, model, ["x","y"], ["c", "xub", "ylb"])
-    mocksolver = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false)
-    MOIU.resetoptimizer!(m, mocksolver)
-    MOIU.attachoptimizer!(m)
-
-    MOI.set!(mocksolver, MOI.TerminationStatus(), MOI.Success)
-    MOI.set!(mocksolver, MOI.ObjectiveValue(), -1.0)
-    MOI.set!(mocksolver, MOI.ResultCount(), 1)
-    MOI.set!(mocksolver, MOI.PrimalStatus(), MOI.FeasiblePoint)
-    MOI.set!(mocksolver, MOI.DualStatus(), MOI.FeasiblePoint)
-    MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0)
-    MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0)
-    MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(c), -1.0)
-    MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.UpperBoundRef(x)), 0.0)
-    MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.LowerBoundRef(y)), 1.0)
-
-    JuMP.optimize(m)
+    JuMP.optimize(m,
with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}(), evalobjective=false)) + + mockoptimizer = JuMP.caching_optimizer(m).optimizer + MOI.set!(mockoptimizer, MOI.TerminationStatus(), MOI.Success) + MOI.set!(mockoptimizer, MOI.ObjectiveValue(), -1.0) + MOI.set!(mockoptimizer, MOI.ResultCount(), 1) + MOI.set!(mockoptimizer, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.DualStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(c), -1.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.UpperBoundRef(x)), 0.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.LowerBoundRef(y)), 1.0) #@test JuMP.isattached(m) @test JuMP.hasresultvalues(m) @@ -74,24 +71,24 @@ end @testset "LP (Direct mode)" begin - mocksolver = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) + mockoptimizer = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) - m = Model(mode = JuMP.Direct, backend = mocksolver) + m = JuMP.direct_model(mockoptimizer) @variable(m, x <= 2.0) @variable(m, y >= 0.0) @objective(m, Min, -x) c = @constraint(m, x + y <= 1) - MOI.set!(mocksolver, MOI.TerminationStatus(), MOI.Success) - MOI.set!(mocksolver, MOI.ObjectiveValue(), -1.0) - MOI.set!(mocksolver, MOI.ResultCount(), 1) - MOI.set!(mocksolver, MOI.PrimalStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.DualStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(c), -1.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.UpperBoundRef(x)), 0.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), 
JuMP.optimizerindex(JuMP.LowerBoundRef(y)), 1.0) + MOI.set!(mockoptimizer, MOI.TerminationStatus(), MOI.Success) + MOI.set!(mockoptimizer, MOI.ObjectiveValue(), -1.0) + MOI.set!(mockoptimizer, MOI.ResultCount(), 1) + MOI.set!(mockoptimizer, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.DualStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(c), -1.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.UpperBoundRef(x)), 0.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.LowerBoundRef(y)), 1.0) JuMP.optimize(m) @@ -115,9 +112,8 @@ # TODO: test Manual mode @testset "IP" begin - mocksolver = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) # Tests the solver= keyword. - m = Model(mode = JuMP.Automatic, optimizer = mocksolver) + m = Model(with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}(), evalobjective=false), caching_mode = MOIU.Automatic) @variable(m, x == 1.0, Int) @variable(m, y, Bin) @objective(m, Max, x) @@ -140,12 +136,13 @@ MOIU.attachoptimizer!(m) - MOI.set!(mocksolver, MOI.TerminationStatus(), MOI.Success) - MOI.set!(mocksolver, MOI.ObjectiveValue(), 1.0) - MOI.set!(mocksolver, MOI.ResultCount(), 1) - MOI.set!(mocksolver, MOI.PrimalStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) + mockoptimizer = JuMP.caching_optimizer(m).optimizer + MOI.set!(mockoptimizer, MOI.TerminationStatus(), MOI.Success) + MOI.set!(mockoptimizer, MOI.ObjectiveValue(), 1.0) + MOI.set!(mockoptimizer, MOI.ResultCount(), 1) + MOI.set!(mockoptimizer, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) 
+ MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) JuMP.optimize(m) @@ -186,22 +183,19 @@ MOIU.loadfromstring!(model, modelstring) MOIU.test_models_equal(JuMP.caching_optimizer(m).model_cache, model, ["x","y"], ["c1", "c2", "c3"]) - mocksolver = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) - MOIU.resetoptimizer!(m, mocksolver) - MOIU.attachoptimizer!(m) - - MOI.set!(mocksolver, MOI.TerminationStatus(), MOI.Success) - MOI.set!(mocksolver, MOI.ObjectiveValue(), -1.0) - MOI.set!(mocksolver, MOI.ResultCount(), 1) - MOI.set!(mocksolver, MOI.PrimalStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.DualStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(c1), -1.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(c2), 2.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(c3), 3.0) + JuMP.optimize(m, with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}(), evalobjective=false)) - JuMP.optimize(m) + mockoptimizer = JuMP.caching_optimizer(m).optimizer + MOI.set!(mockoptimizer, MOI.TerminationStatus(), MOI.Success) + MOI.set!(mockoptimizer, MOI.ObjectiveValue(), -1.0) + MOI.set!(mockoptimizer, MOI.ResultCount(), 1) + MOI.set!(mockoptimizer, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.DualStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(c1), -1.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(c2), 2.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(c3), 3.0) #@test JuMP.isattached(m) @test JuMP.hasresultvalues(m) @@ -244,19 +238,19 
@@ MOIU.loadfromstring!(model, modelstring) MOIU.test_models_equal(JuMP.caching_optimizer(m).model_cache, model, ["x","y","z"], ["varsoc", "affsoc", "rotsoc"]) - mocksolver = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) - MOIU.resetoptimizer!(m, mocksolver) + mockoptimizer = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) + MOIU.resetoptimizer!(m, mockoptimizer) MOIU.attachoptimizer!(m) - MOI.set!(mocksolver, MOI.TerminationStatus(), MOI.Success) - MOI.set!(mocksolver, MOI.ResultCount(), 1) - MOI.set!(mocksolver, MOI.PrimalStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.DualStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(z), 0.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(varsoc), [-1.0,-2.0,-3.0]) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(affsoc), [1.0,2.0,3.0]) + MOI.set!(mockoptimizer, MOI.TerminationStatus(), MOI.Success) + MOI.set!(mockoptimizer, MOI.ResultCount(), 1) + MOI.set!(mockoptimizer, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.DualStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x), 1.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(y), 0.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(z), 0.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(varsoc), [-1.0,-2.0,-3.0]) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(affsoc), [1.0,2.0,3.0]) JuMP.optimize(m) @@ -300,19 +294,19 @@ MOIU.loadfromstring!(model, modelstring) MOIU.test_models_equal(JuMP.caching_optimizer(m).model_cache, model, ["x11","x12","x22"], ["varpsd", "conpsd"]) - mocksolver = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), 
evalobjective=false) - MOIU.resetoptimizer!(m, mocksolver) + mockoptimizer = MOIU.MockOptimizer(JuMP.JuMPMOIModel{Float64}(), evalobjective=false) + MOIU.resetoptimizer!(m, mockoptimizer) MOIU.attachoptimizer!(m) - MOI.set!(mocksolver, MOI.TerminationStatus(), MOI.Success) - MOI.set!(mocksolver, MOI.ResultCount(), 1) - MOI.set!(mocksolver, MOI.PrimalStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.DualStatus(), MOI.FeasiblePoint) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x[1,1]), 1.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x[1,2]), 2.0) - MOI.set!(mocksolver, MOI.VariablePrimal(), JuMP.optimizerindex(x[2,2]), 4.0) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(varpsd), [1.0,2.0,3.0]) - MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(conpsd), [4.0,5.0,6.0]) + MOI.set!(mockoptimizer, MOI.TerminationStatus(), MOI.Success) + MOI.set!(mockoptimizer, MOI.ResultCount(), 1) + MOI.set!(mockoptimizer, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.DualStatus(), MOI.FeasiblePoint) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x[1,1]), 1.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x[1,2]), 2.0) + MOI.set!(mockoptimizer, MOI.VariablePrimal(), JuMP.optimizerindex(x[2,2]), 4.0) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(varpsd), [1.0,2.0,3.0]) + MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(conpsd), [4.0,5.0,6.0]) JuMP.optimize(m) diff --git a/test/model.jl b/test/model.jl index 8817c2f7f91..501e0c6c307 100644 --- a/test/model.jl +++ b/test/model.jl @@ -20,24 +20,22 @@ end @testset "Bridges" begin @testset "Automatic bridging" begin # optimizer not supporting Interval - optimizer = MOIU.MockOptimizer(LPModel{Float64}()); - model = Model(optimizer=optimizer) + model = Model(with_optimizer(MOIU.MockOptimizer, LPModel{Float64}())) @variable model x cref = @constraint model 0 <= x + 
1 <= 1 @test cref isa JuMP.ConstraintRef{JuMP.Model,MOI.ConstraintIndex{MOI.ScalarAffineFunction{Float64},MOI.Interval{Float64}}} JuMP.optimize(model) end @testset "Automatic bridging disabled with `bridge_constraints` keyword" begin - optimizer = MOIU.MockOptimizer(LPModel{Float64}()); - model = Model(optimizer=optimizer, bridge_constraints=false) + model = Model(with_optimizer(MOIU.MockOptimizer, LPModel{Float64}()), bridge_constraints=false) @test model.moibackend isa MOIU.CachingOptimizer @test model.moibackend === JuMP.caching_optimizer(model) @variable model x @test_throws ErrorException @constraint model 0 <= x + 1 <= 1 end @testset "No bridge automatically added in Direct mode" begin - optimizer = MOIU.MockOptimizer(LPModel{Float64}()); - model = Model(backend=optimizer, mode=JuMP.Direct) + optimizer = MOIU.MockOptimizer(LPModel{Float64}()) + model = JuMP.direct_model(optimizer) @variable model x @test_throws ErrorException @constraint model 0 <= x + 1 <= 1 end diff --git a/test/nlp_solver.jl b/test/nlp_solver.jl index 70b598f12f1..012f87287d3 100644 --- a/test/nlp_solver.jl +++ b/test/nlp_solver.jl @@ -27,8 +27,6 @@ using Compat.Test using MathOptInterface const MOI = MathOptInterface -new_optimizer() = IpoptOptimizer(print_level=0) - @testset "NLP solver tests" begin @testset "HS071" begin @@ -40,7 +38,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) # 1 <= x1, x2, x3, x4 <= 5 # Start at (1,5,5,1) # End at (1.000..., 4.743..., 3.821..., 1.379...) - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) initval = [1,5,5,1] @variable(m, 1 <= x[i=1:4] <= 5, start=initval[i]) @NLobjective(m, Min, x[1]*x[4]*(x[1]+x[2]+x[3]) + x[3]) @@ -65,7 +63,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) # 1 <= x1, x2, x3, x4 <= 5 # Start at (1,5,5,1) # End at (1.000..., 4.743..., 3.821..., 1.379...) 
- m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) initval = [1,5,5,1] @variable(m, 1 <= x[i=1:4] <= 5, start=initval[i]) JuMP.setNLobjective(m, :Min, :($(x[1])*$(x[4])*($(x[1])+$(x[2])+$(x[3])) + $(x[3]))) @@ -91,7 +89,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) L = [0.0, 0.0, -0.55, -0.55, 196, 196, 196, -400, -400] U = [Inf, Inf, 0.55, 0.55, 252, 252, 252, 800, 800] - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, L[i] <= x[i=1:9] <= U[i], start = 0.0) @NLobjective(m, Min, 3 * x[1] + 1e-6 * x[1]^3 + 2 * x[2] + .522074e-6 * x[2]^3) @@ -131,7 +129,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "HS110" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, -2.001 <= x[1:10] <= 9.999, start = 9) @NLobjective(m, Min, @@ -152,7 +150,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) @testset "HS111" begin c = [-6.089, -17.164, -34.054, -5.914, -24.721, -14.986, -24.100, -10.708, -26.662, -22.179] - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, -100 <= x[1:10] <= 100, start = -2.3) @NLobjective(m, Min, @@ -174,7 +172,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) @testset "HS112" begin c = [-6.089, -17.164, -34.054, -5.914, -24.721, -14.986, -24.100, -10.708, -26.662, -22.179] - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, x[1:10] >= 1e-6, start = 0.1) @NLobjective(m, Min, sum(x[j]*(c[j] + log(x[j]/sum(x[k] for k=1:10))) for j=1:10)) @@ -201,7 +199,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) upper = [2000, 16000, 120, 5000, 2000, 93, 95, 12, 4, 162] start = [1745, 12000, 110, 3048, 1974, 89.2, 92.8, 8, 3.6, 145] - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, lower[i] <= 
x[i=1:n] <= upper[i], start = start[i]) @NLobjective(m, Min, 5.04*x[1] + .035*x[2] + 10*x[3] + 3.36*x[5] - .063*x[4]*x[7]) @@ -240,7 +238,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) upper = [1.0, 1.0, 1.0, 0.1, 0.9, 0.9, 1000, 1000, 1000, 500, 150, 150, 150, Inf, Inf, Inf] start = [0.5 2 0.8 3 0.9 4 0.1 5 0.14 6 0.5 7 489 8 80 9 650 0.5 2 0.8 3 0.9 4 0.1 5 0.14 6 0.5 7 489 8 80 9 650] - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, lower[i] <= x[i=1:N] <= upper[i], start = start[i]) @NLobjective(m, Min, x[11] + x[12] + x[13]) @@ -275,7 +273,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "HS118" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) L = zeros(15) L[1] = 8.0 @@ -343,7 +341,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "Two-sided constraints" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, x) @NLobjective(m, Max, x) l = -1 @@ -366,7 +364,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "Two-sided constraints (no macros)" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, x) JuMP.setNLobjective(m, :Max, x) l = -1 @@ -389,7 +387,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "Duals" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, x >= 0) @variable(m, y <= 5) @variable(m, 2 <= z <= 4) @@ -445,7 +443,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "Quadratic inequality constraints, linear objective" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, -2 <= x <= 2) @variable(m, -2 <= y <= 2) @objective(m, Min, x - y) @@ -460,7 +458,7 @@ new_optimizer() = 
IpoptOptimizer(print_level=0) end @testset "Quadratic inequality constraints, NL objective" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, -2 <= x <= 2) @variable(m, -2 <= y <= 2) @NLobjective(m, Min, x - y) @@ -475,7 +473,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "Quadratic equality constraints" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, 0 <= x[1:2] <= 1) @constraint(m, x[1]^2 + x[2]^2 == 1/2) @NLobjective(m, Max, x[1] - x[2]) @@ -489,7 +487,7 @@ new_optimizer() = IpoptOptimizer(print_level=0) end @testset "Fixed variables" begin - m = Model(optimizer=new_optimizer()) + m = Model(with_optimizer(IpoptOptimizer, print_level=0)) @variable(m, x == 0) @variable(m, y ≥ 0) @objective(m, Min, y)