
Factory #1384

Merged
mlubin merged 29 commits into master from bl/factory on Aug 2, 2018
Conversation

blegat (Member) commented Jul 28, 2018

New constructor interface as discussed here.
Factories allow creating new optimizers on demand, which would make it possible to implement copy (#1381, #1300) without requiring any change in MOI such as jump-dev/MathOptInterface.jl#387.
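
For orientation, a minimal sketch of the interface this PR introduces (IpoptOptimizer and its print_level option come from the era's Ipopt.jl wrapper, as used in the diffs below):

```julia
using JuMP, Ipopt

# A factory bundles an optimizer constructor with its arguments, so JuMP
# can create fresh optimizer instances on demand (e.g., when copying).
factory = with_optimizer(IpoptOptimizer, print_level=0)

# The factory can be given when the model is constructed...
m = Model(factory)

# ...or later, when optimizing.
m2 = Model()
JuMP.optimize(m2, factory)
```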

@mlubin mentioned this pull request Jul 28, 2018
@@ -40,7 +38,7 @@ new_optimizer() = IpoptOptimizer(print_level=0)
# 1 <= x1, x2, x3, x4 <= 5
# Start at (1,5,5,1)
# End at (1.000..., 4.743..., 3.821..., 1.379...)
m = Model(optimizer=new_optimizer())
m = Model(with_optimizer(IpoptOptimize, print_level=0))
Member

Typo: IpoptOptimizer. This file isn't run under CI because it depends on Ipopt. Please test manually.

codecov bot commented Jul 29, 2018

Codecov Report

Merging #1384 into master will decrease coverage by 4.14%.
The diff coverage is 100%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #1384      +/-   ##
==========================================
- Coverage   89.58%   85.43%   -4.15%     
==========================================
  Files          25       24       -1     
  Lines        3427     3426       -1     
==========================================
- Hits         3070     2927     -143     
- Misses        357      499     +142
Impacted Files Coverage Δ
src/JuMP.jl 79.28% <100%> (+0.02%) ⬆️
src/optimizerinterface.jl 71.42% <100%> (+4.76%) ⬆️
src/JuMPArray.jl 29.03% <0%> (-47.01%) ⬇️
src/Derivatives/linearity.jl 75.51% <0%> (-20.41%) ⬇️
src/Derivatives/forward.jl 75.19% <0%> (-18.61%) ⬇️
src/Derivatives/sparsity.jl 79.01% <0%> (-16.05%) ⬇️
src/Derivatives/coloring.jl 90.75% <0%> (-6.94%) ⬇️
src/containers.jl 86.84% <0%> (-5.27%) ⬇️
src/Derivatives/conversion.jl 94.33% <0%> (-3.78%) ⬇️
src/Derivatives/subexpressions.jl 97.5% <0%> (-2.5%) ⬇️
... and 5 more

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 64fc678...39be6af. Read the comment docs.

codecov bot commented Jul 29, 2018

Codecov Report

Merging #1384 into master will increase coverage by 0.16%.
The diff coverage is 100%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #1384      +/-   ##
==========================================
+ Coverage   89.58%   89.75%   +0.16%     
==========================================
  Files          25       25              
  Lines        3427     3698     +271     
==========================================
+ Hits         3070     3319     +249     
- Misses        357      379      +22
Impacted Files Coverage Δ
src/JuMP.jl 76.4% <100%> (-2.86%) ⬇️
src/optimizerinterface.jl 76.92% <100%> (+10.25%) ⬆️
src/variables.jl 87.82% <0%> (-0.8%) ⬇️
src/operators.jl 96.69% <0%> (-0.02%) ⬇️
src/quadexpr.jl 92.5% <0%> (+0.09%) ⬆️
src/affexpr.jl 97.58% <0%> (+0.12%) ⬆️
src/macros.jl 90.53% <0%> (+0.66%) ⬆️
src/nlp.jl 82.05% <0%> (+2.49%) ⬆️
src/print.jl 85.65% <0%> (+5.13%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 64fc678...b4b4f17. Read the comment docs.

@blegat changed the title from [WIP] Factory to Factory on Jul 29, 2018
IainNZ (Collaborator) commented Jul 29, 2018

The word backend doesn't seem to appear in MathOptInterface.jl at all - does it have a formal definition?

mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
A JuMP model stores a
[MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
backend internally that contains the optimization solver. The JuMP layer on top
Member

"contains" -> "acts as"?

mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
A JuMP model stores a
[MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
backend internally that contains the optimization solver. The JuMP layer on top
Member

The backend might also not support optimize!.

mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
A JuMP model stores a
[MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
backend internally that contains the optimization solver. The JuMP layer on top
Member

"JuMP can be viewed as a lightweight user-friendly layer on top of the MOI backend."

backend to support such modifications.

While this allows JuMP API to to be a thin wrapper on top of the solver API,
as mentionned in the last point above, this seems rather demanding on the
Member

Spelling: "mentioned"

optimizer:

* `CachingOptimizer`: it maintain a cache of the model so that when the
the optimizer does not support a modification of the model, the optimizer's
Member

"incremental change" seems more appropriate than "modification" (since it could include adding constraints)

src/JuMP.jl Outdated
@@ -77,6 +78,43 @@ const MOIBIN = MOICON{MOI.SingleVariable,MOI.ZeroOne}

@MOIU.model JuMPMOIModel (ZeroOne, Integer) (EqualTo, GreaterThan, LessThan, Interval) (Zeros, Nonnegatives, Nonpositives, SecondOrderCone, RotatedSecondOrderCone, GeometricMeanCone, PositiveSemidefiniteConeTriangle, PositiveSemidefiniteConeSquare, RootDetConeTriangle, RootDetConeSquare, LogDetConeTriangle, LogDetConeSquare) () (SingleVariable,) (ScalarAffineFunction,ScalarQuadraticFunction) (VectorOfVariables,) (VectorAffineFunction,)

struct Factory
Member

OptimizerFactory maybe?
Add a comment describing the purpose. It's a user-friendly closure.

src/JuMP.jl Outdated
optimizer=nothing,
bridge_constraints=true)
# Inner constructor
function Model(factory::Union{Nothing, Factory}, moibackend::MOI.ModelLike)
Member

Does this still need to be an inner constructor?

setoptimizer(model, with_optimizer(IpoptOptimizer, print_level=0))
```
"""
function setoptimizer(model::Model, factory::Factory)
Member

We really need to decide on a convention for underscores. with_optimizer and setoptimizer are inconsistent.

Member Author

I agree with using underscores, it's a typo :)

src/JuMP.jl Outdated
# In Manual and Automatic mode: Factory used to create the optimizer or
# Nothing if it has not already been set
# In Direct mode: Nothing
factory::Union{Nothing, Factory}
# In Manual and Automatic modes, LazyBridgeOptimizer{CachingOptimizer}.
# In Direct mode, will hold an AbstractOptimizer.
moibackend::MOI.AbstractOptimizer
Member

Could this be an MOI.ModelLike to support non-optimizer backends?

Member Author

Yes, I don't remember which PR removed that possibility. The model of the CachingOptimizer can be a MOI.ModelLike, and the CachingOptimizer will still be an AbstractOptimizer, so this only removes the possibility of using a non-optimizer in Direct mode. Do we want to allow non-optimizers in Direct mode?

mlubin (Member) commented Jul 29, 2018

"backend" isn't an MOI concept. Its usage needs to be defined in the docs. It's called a backend instead of the optimizer because:

  • In Manual and Automatic modes, it's not the same optimizer as what the user provided.
  • I imagine that it might not support optimize! and just be a model container, e.g., backed by an MPS file.

TODO: Describe the connection between JuMP and solvers. Automatic vs. Manual
mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
A JuMP model keeps a
[MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
Member

Style: lines are allowed to go over 80 chars if they end in a long URL

set_optimizer(model, with_optimizer(IpoptOptimizer, print_level=0))
```
"""
function set_optimizer(model::Model, factory::OptimizerFactory)
Member

Does this need to check what mode the model is in? If I'm in Direct mode I can't switch, right?

mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
A JuMP model keeps a
[MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
backend internally that stores the optimization problem and act as the
odow (Member) commented Jul 29, 2018

Is backend mentioned above somewhere? It might need to be defined. (Edit: should have read the above comments better.)

s/and act/and acts

In Automatic and Manual modes, two MOI layers are automatically applied to the
optimizer:

* `CachingOptimizer`: it maintain a cache of the model so that when the
Member

s/it maintain/it maintains or s/it maintain/maintains

TODO: Describe the connection between JuMP and solvers. Automatic vs. Manual
mode. CachingOptimizer. How to set/change solvers. How to set parameters (solver
A JuMP model keeps a [MathOptInterface (MOI)](https://github.com/JuliaOpt/MathOptInterface.jl)
*backend* internally that stores the optimization problem and acts as the
Member

Say this is an MOI.ModelLike.

modifications are directly applied to the MOI backend thus expecting the
backend to support such modifications.

While this allows JuMP API to to be a thin wrapper on top of the solver API,
Member

Nit: "JuMP API" -> "JuMP" or "the JuMP API"

model can be discarded and restored from the cache just before optimization.
The `CachingOptimizer` has two different modes: Automatic and Manual
corresponding to the two JuMP modes with the same names.
* `LazyBridgeOptimizer`: when a constraint added is not supported by the
Member

Mention that this can be disabled using the bridge_constraints keyword argument to Model.
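
For instance (a one-line sketch; the keyword appears in the constructor signature quoted further down):

```julia
# With bridging disabled, adding a constraint the backend does not
# support raises an error instead of being bridged.
model = Model(bridge_constraints=false)
```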


To create a fresh new JuMP model (or a fresh new copy of a JuMP model), JuMP
needs to create a new empty optimizer instance. New optimizer instances can
be obtained using a factory that can be created using the
Member

I'd change "factory" to OptimizerFactory.

src/JuMP.jl Outdated

User-friendly closure that creates new MOI models. New `OptimizerFactory`s are
created with [`with_optimizer`](@ref) and new models are created from the
factory with [`create_model`](@ref).
Member

Example usage would help a lot here.
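
Something like the following would do (sketch reusing the example that appears later in this diff):

```julia
# One factory, many independent optimizer instances.
factory = with_optimizer(IpoptOptimizer, print_level=0)
optimizer1 = JuMP.create_model(factory)
optimizer2 = JuMP.create_model(factory)
```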

src/JuMP.jl Outdated
Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic,
bridge_constraints::Bool=true)

Return a new JuMP model without any optimizer storing the model in a cache.
Member

"without any optimizer storing the model in a cache" or just "without any optimizer"?

src/JuMP.jl Outdated
Return a new JuMP model using `backend` to store the model and solve it. As
opposed to the [`Model`](@ref) constructor, no cache of the model is stored
outside of `backend` and no bridges are automatically applied to `backend`.
The absence of cache reduces the memory footprint but it is importnat to bear
Member

Typo: "important"

src/JuMP.jl Outdated
* When `backend` does not support an operation such as adding
variables/constraints after solver or modifying constraints, an error is
thrown. With models created using the [`Model`](@ref) constructor, such
situations can be dealt with modifying the cache only and copying the model
Member

"such situations can be dealt with modifying the cache only and copying the model cache once JuMP.optimize is called." -> "such situations are dealt with by storing the modifications in a cache and loading them into the optimizer when JuMP.optimize is called".

src/JuMP.jl Outdated
thrown. With models created using the [`Model`](@ref) constructor, such
situations can be dealt with modifying the cache only and copying the model
cache once `JuMP.optimize` is called.
* When `backend` does not support a constraint type, the constraint is not
Member

More clearly, "no constraint bridging is supported by default".

src/JuMP.jl Outdated
cache once `JuMP.optimize` is called.
* When `backend` does not support a constraint type, the constraint is not
automatically bridged to constraints supported by `backend`.
* The optimizer used cannot be changed. With models created using the
Member

The explanation past "The optimizer used cannot be changed" is too verbose. I would just say "The optimizer cannot be changed after the Model is constructed."

src/JuMP.jl Outdated
`IpoptOptimizer`s:
```julia
factory = with_optimizer(IpoptOptimizer, print_level=0)
optimizer1 = JuMP.create_model(factory)
Member

I'd expect JuMP.create_model to create a JuMP model. This is very confusing. What about overloading the call operator on the factory? Like factory() to create an optimizer.
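
A sketch of that suggestion (the field names constructor, args, and kwargs are illustrative, not necessarily what this PR uses):

```julia
# Hypothetical layout; a user-friendly closure over an optimizer constructor.
struct OptimizerFactory
    constructor       # e.g. IpoptOptimizer
    args::Tuple       # positional arguments to pass through
    kwargs            # keyword arguments to pass through
end

# Overload the call operator so that `factory()` builds a new optimizer.
(factory::OptimizerFactory)() =
    factory.constructor(factory.args...; factory.kwargs...)
```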

ignore_optimize_hook=(model.optimizehook===nothing))
"""
function optimize(model::Model,
factory::Union{Nothing, OptimizerFactory} = nothing;
Member

It's better to call it optimizer_factory (in code) and "optimizer factory" (in text). factory is too generic.

MOI.set!(mocksolver, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.LowerBoundRef(y)), 1.0)

JuMP.optimize(m)
JuMP.optimize(m, with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}(), evalobjective=false))
Member

I like how this looks.

See the [MOI documentation](http://www.juliaopt.org/MathOptInterface.jl/stable/)
for more details on these two MOI layers.

To create a fresh new JuMP model, JuMP needs to create a new empty optimizer
Member

"JuMP needs to" isn't correct. You can create a JuMP model without an optimizer.

with_optimizer
```

The optimizer factory can be set to the JuMP model in the
Member

This no longer involves setting, right? Maybe say that the factory can be provided either at model construction time or at optimize time.

# The NLPData is not kept in sync, so re-set it here.
# TODO: Consider how to handle incremental solves.
if model.nlpdata !== nothing
MOI.set!(model, MOI.NLPBlock(), create_nlp_block_data(model))
empty!(model.nlpdata.nlconstr_duals)
end

if optimizer_factory !== nothing
Member

We may want to consider throwing an error in this case if the caching optimizer is not in the NoOptimizer state. It seems confusing to write:

```julia
model = Model(with_optimizer(Gurobi.Optimizer))
...
JuMP.optimize(model, with_optimizer(CPLEX.Optimizer))
```

Also error if we're in direct mode. (MOIU.resetoptimizer! will error, but the user didn't call this function.)
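
A sketch of the suggested guard (MOIU.state and MOIU.NoOptimizer are part of MOI's CachingOptimizer API; caching_optimizer is a hypothetical accessor for the model's CachingOptimizer):

```julia
# Refuse to silently replace an optimizer that was set at construction time.
if MOIU.state(caching_optimizer(model)) != MOIU.NoOptimizer
    error("The optimizer factory should be provided either to `Model` or " *
          "to `JuMP.optimize`, not both.")
end
```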

[`JuMP.optimize`](@ref) function:
```@docs
JuMP.optimize
```
Member

We're completely missing a discussion about how to control the caching optimizer states. This can be done later, but we should have a TODO.
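
For the record, the MOIU state transitions in question (names as of this era of MOI; resetoptimizer! is already mentioned above, the other two are its companions):

```julia
MOIU.resetoptimizer!(m, optimizer)  # set or replace the optimizer (state: EmptyOptimizer)
MOIU.dropoptimizer!(m)              # detach the optimizer (state: NoOptimizer)
MOIU.attachoptimizer!(m)            # load the cache into the optimizer (state: AttachedOptimizer)
```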

@@ -49,6 +49,12 @@ function optimize(model::Model,
end

if optimizer_factory !== nothing
if mode(model) == Direct
error("An optimizer factory cannot be provided at the `optimize` call in Direct mode")
Member

Nit: period at the end of the error (as below).

@@ -90,6 +90,7 @@
MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.UpperBoundRef(x)), 0.0)
MOI.set!(mockoptimizer, MOI.ConstraintDual(), JuMP.optimizerindex(JuMP.LowerBoundRef(y)), 1.0)

@test_throws ErrorException JuMP.optimize(m, with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}()))
Member

It's better to add a separate small test for this (also below).
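
e.g. something along these lines, reusing the MockOptimizer setup from this test file (a sketch; the testset name is illustrative):

```julia
@testset "Factory passed to optimize when one is already set" begin
    model = Model(with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}()))
    @test_throws ErrorException JuMP.optimize(
        model, with_optimizer(MOIU.MockOptimizer, JuMP.JuMPMOIModel{Float64}()))
end
```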

@mlubin merged commit 0daf405 into master Aug 2, 2018
@mlubin deleted the bl/factory branch August 2, 2018 20:39