diff --git a/.travis.yml b/.travis.yml index 849e45c3442..f9582d14877 100644 --- a/.travis.yml +++ b/.travis.yml @@ -20,6 +20,6 @@ addons: - libblas-dev after_success: - echo $TRAVIS_JULIA_VERSION - - julia -e 'Pkg.add("Coverage"); cd(Pkg.dir("JuMP")); using Coverage; Coveralls.submit(process_folder()); Codecov.submit(process_folder())' - - julia -e 'Pkg.add("Documenter")' - - julia -e 'cd(Pkg.dir("JuMP")); include(joinpath("docs", "make.jl"))' + - julia -e '(VERSION >= v"0.7" && using Pkg); Pkg.add("Coverage"); cd(Pkg.dir("JuMP")); using Coverage; Coveralls.submit(process_folder()); Codecov.submit(process_folder())' + - julia -e '(VERSION >= v"0.7" && using Pkg); Pkg.add("Documenter")' + - julia -e 'VERSION == v"1.0" && (using Pkg; cd(Pkg.dir("JuMP")); include(joinpath("docs", "make.jl")))' diff --git a/docs/make.jl b/docs/make.jl index 45c47540d32..08de87a6503 100644 --- a/docs/make.jl +++ b/docs/make.jl @@ -32,7 +32,7 @@ deploydocs( repo = "github.com/JuliaOpt/JuMP.jl.git", target = "build", osname = "linux", - julia = "0.6", + julia = "1.0", deps = nothing, make = nothing ) diff --git a/docs/src/quickstart.md b/docs/src/quickstart.md index 81d383abac0..de0dbae241b 100644 --- a/docs/src/quickstart.md +++ b/docs/src/quickstart.md @@ -1,4 +1,182 @@ Quick Start Guide ================= -TODO: Quick example of solving an LP and getting the solution back. +This quick start guide will introduce the main concepts of JuMP. If you have +used another modeling language embedded in a high-level language, such as +PuLP (Python), or a solver-specific interface, you will find most of this +familiar. If you are coming from an AMPL or similar background, you may find +some of the concepts novel, but the general appearance will still be familiar. + +The example in this guide is deliberately kept simple. There are more complex +examples in the [`JuMP/examples/` folder](https://github.com/JuliaOpt/JuMP.jl/tree/master/examples).
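Before walking through the pieces one at a time, it may help to see the finished product. The following is a sketch rather than a doctest: it assumes the JuMP 0.19-style API used throughout this guide and that the GLPK.jl package is installed.

```julia
using JuMP, GLPK

# Build the small LP developed step-by-step in this guide.
model = Model(with_optimizer(GLPK.Optimizer))
@variable(model, 0 <= x <= 2)
@variable(model, 0 <= y <= 30)
@objective(model, Max, 5x + 3y)
@constraint(model, con, 1x + 5y <= 3)

# Solve and inspect the results.
JuMP.optimize!(model)
JuMP.objective_value(model)  # 10.6, attained at x = 2.0, y = 0.2
```

Each line of this script is explained in the sections that follow.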
+ +Once JuMP is installed, to use JuMP in your programs, you just need to say: +```jldoctest quickstart_example +julia> using JuMP +``` + +You also need to include a Julia package which provides an appropriate solver. +One such solver is `GLPK.Optimizer`, which is provided by the +[GLPK.jl package](https://github.com/JuliaOpt/GLPK.jl). +```julia +julia> using GLPK +``` +See [Installation Guide](@ref) for a list of other solvers you can use. + +Models are created with the `Model()` function. The `with_optimizer` syntax is +used to specify the optimizer to be used: +```julia +julia> model = Model(with_optimizer(GLPK.Optimizer)) +A JuMP Model +``` + +```@meta +DocTestSetup = quote + # Using a caching optimizer removes the need to load a solver such as GLPK + # for building the documentation. + const MOI = JuMP.MathOptInterface + model = Model(with_optimizer(MOI.Utilities.MockOptimizer, + JuMP.JuMPMOIModel{Float64}(), + eval_objective_value = false, + eval_variable_constraint_dual = false)) +end +``` +!!! note + Your model doesn't have to be called `model` - it's just a name. + +There are a few options for defining a variable, depending on whether you want +to have lower bounds, upper bounds, both bounds, or even no bounds. The +following commands will create two variables, `x` and `y`, with both lower and +upper bounds. Note that the first argument is our model variable `model`. These +variables are associated with this model and cannot be used in another model. +```jldoctest quickstart_example +julia> @variable(model, 0 <= x <= 2) +x + +julia> @variable(model, 0 <= y <= 30) +y +``` +See the [Variables](@ref) section for more information on creating variables. + +```@meta +DocTestSetup = nothing +``` + +Next we'll set our objective. Note again the `model`, so we know which model's +objective we are setting! The objective sense, `Max` or `Min`, should be +provided as the second argument.
Note also that we don't have a multiplication +`*` symbol between `5` and our variable `x` - Julia is smart enough to not need +it! Feel free to stick with `*` if it makes you feel more comfortable, as we +have done with `3 * y`. (We are intentionally inconsistent here to show both syntaxes; in your own code, it is good practice to pick one style and use it consistently.) +```jldoctest quickstart_example +julia> @objective(model, Max, 5x + 3 * y) +``` + +Adding constraints is a lot like setting the objective. Here we create a +less-than-or-equal-to constraint using `<=`, but we can also create equality +constraints using `==` and greater-than-or-equal-to constraints with `>=`: +```jldoctest quickstart_example; filter=r"≤|<=" +julia> @constraint(model, con, 1x + 5y <= 3) +x + 5 y <= 3.0 +``` +Note that, in a similar manner to the `@variable` macro, we have named the +constraint `con`. This binds the constraint to the Julia variable `con` for +later analysis. + +Models are solved with the `JuMP.optimize!` function: +```jldoctest quickstart_example +julia> JuMP.optimize!(model) +``` + +```@meta +DocTestSetup = quote + # Now we load in the solution. Using a caching optimizer removes the need to + # load a solver such as GLPK for building the documentation.
+ mock = JuMP.caching_optimizer(model).optimizer + MOI.set(mock, MOI.TerminationStatus(), MOI.Success) + MOI.set(mock, MOI.PrimalStatus(), MOI.FeasiblePoint) + MOI.set(mock, MOI.DualStatus(), MOI.FeasiblePoint) + MOI.set(mock, MOI.ResultCount(), 1) + MOI.set(mock, MOI.ObjectiveValue(), 10.6) + MOI.set(mock, MOI.VariablePrimal(), JuMP.optimizer_index(x), 2.0) + MOI.set(mock, MOI.VariablePrimal(), JuMP.optimizer_index(y), 0.2) + MOI.set(mock, MOI.ConstraintDual(), JuMP.optimizer_index(con), -0.6) + MOI.set(mock, MOI.ConstraintDual(), JuMP.optimizer_index(JuMP.UpperBoundRef(x)), -4.4) + MOI.set(mock, MOI.ConstraintDual(), JuMP.optimizer_index(JuMP.LowerBoundRef(y)), 0.0) +end +``` + +After the call to `JuMP.optimize!` has finished, we need to understand why the +optimizer stopped. This can be for a number of reasons. First, the solver might +have found the optimal solution, or proved that the problem is infeasible. +However, it might also have run into numerical difficulties, or terminated due +to a setting such as a time limit. We can ask the solver why it stopped using +the `JuMP.termination_status` function: +```jldoctest quickstart_example +julia> JuMP.termination_status(model) +Success::TerminationStatusCode = 0 +``` +In this case, `GLPK` returned `Success`. This does *not* mean that it has found +the optimal solution. Instead, it indicates that GLPK has finished running and +did not encounter any errors or user-provided termination limits. + +```@meta +DocTestSetup = nothing +``` + +To understand the reason for termination in more detail, we need to query +`JuMP.primal_status`: +```jldoctest quickstart_example +julia> JuMP.primal_status(model) +FeasiblePoint::ResultStatusCode = 1 +``` +This indicates that GLPK has found a `FeasiblePoint` to the primal problem. +Coupled with the `Success` from `JuMP.termination_status`, we can infer that GLPK +has indeed found the optimal solution.
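Putting the two status queries together, a common defensive pattern is to check both before trusting the result. This is a sketch using the same status codes shown above; it assumes `MOI` has been bound as in the setup block (`const MOI = JuMP.MathOptInterface`).

```julia
JuMP.optimize!(model)

# Only trust the solution if the solver finished cleanly *and*
# reports a feasible primal point.
if JuMP.termination_status(model) == MOI.Success &&
        JuMP.primal_status(model) == MOI.FeasiblePoint
    println("Objective value: ", JuMP.objective_value(model))
else
    println("Solver stopped: ", JuMP.termination_status(model))
end
```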
We can also query `JuMP.dual_status`: +```jldoctest quickstart_example +julia> JuMP.dual_status(model) +FeasiblePoint::ResultStatusCode = 1 +``` +Like the `primal_status`, GLPK indicates that it has found a `FeasiblePoint` to +the dual problem. + +Finally, we can query the result of the optimization. First, we can query the +objective value: +```jldoctest quickstart_example +julia> JuMP.objective_value(model) +10.6 +``` +We can also query the primal result values of the `x` and `y` variables: +```jldoctest quickstart_example +julia> JuMP.result_value(x) +2.0 + +julia> JuMP.result_value(y) +0.2 +``` + +We can also query the value of the dual variable associated with the constraint +`con` (which we bound to a Julia variable when defining the constraint): +```jldoctest quickstart_example +julia> JuMP.result_dual(con) +-0.6 +``` + +To query the dual variables associated with the variable bounds, things are a +little trickier as we first need to obtain a reference to the constraint: +```jldoctest quickstart_example; filter=r"≤|<=" +julia> x_upper = JuMP.UpperBoundRef(x) +x <= 2.0 + +julia> JuMP.result_dual(x_upper) +-4.4 +``` +A similar process can be followed to obtain the dual of the lower bound +constraint on `y`: +```jldoctest quickstart_example; filter=r"≥|>=" +julia> y_lower = JuMP.LowerBoundRef(y) +y >= 0.0 + +julia> JuMP.result_dual(y_lower) +0.0 +``` diff --git a/docs/src/variables.md b/docs/src/variables.md index 3456f515de1..e2a2e2431e7 100644 --- a/docs/src/variables.md +++ b/docs/src/variables.md @@ -30,7 +30,7 @@ julia> model = Model() A JuMP Model julia> @variable(model, x[1:2]) -2-element Array{JuMP.VariableRef,1}: +2-element Array{VariableRef,1}: x[1] x[2] ``` @@ -44,6 +44,9 @@ This code does three things: To reduce confusion, we will attempt, where possible, to always refer to variables with their corresponding prefix. +!!! warn + Creating two JuMP variables with the same name results in an error at runtime. 
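A minimal illustration of the warning above, assuming an in-scope `model`: the second `@variable` call fails because the name `x` is already registered with the model (the exact error text may vary).

```julia
model = Model()
@variable(model, x)  # fine: registers x with the model
@variable(model, x)  # errors at runtime: x is already registered
```

To reuse a name, either delete the first variable or create the second one anonymously, as described later in this section.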
+ JuMP variables can have attributes, such as names or an initial primal start value. We illustrate the name attribute in the following example: ```jldoctest variables @@ -65,8 +68,8 @@ julia> y decision variable ``` -Because `y` is a Julia variable, I can bind it to a different value. For example, -if I go: +Because `y` is a Julia variable, we can bind it to a different value. For +example, if we write: ```jldoctest variables julia> y = 1 1 @@ -126,13 +129,13 @@ In the above examples, `x_free` represents an unbounded optimization variable, !!! note When creating a variable with only a lower-bound or an upper-bound, and the value of the bound is not a numeric literal, the name must appear on the - left-hand side. Putting the name on the right-hand side (e.g., `a=1`, - `@variable(model, a <= x)`) will result in an error. - - **Extra for experts:** the reason for this is that at compile time, JuMP - does not type and value information. Therefore, the case `@variable(model, - a <= b)` is ambiguous as JuMP cannot infer whether `a` is a constant and - `b` is the intended variable name, or vice-versa. + left-hand side. Putting the name on the right-hand side will result in an + error. For example: + ```julia + @variable(model, 1 <= x) # works + a = 1 + @variable(model, a <= x) # errors + ``` We can query whether an optimization variable has a lower- or upper-bound via the `JuMP.has_lower_bound` and `JuMP.has_upper_bound` functions. For example: @@ -165,9 +168,44 @@ julia> JuMP.lower_bound(x) 1.0 ``` -!!! warn - If you create two JuMP variables with the same name, an error will be - thrown. +Another option is to use the `JuMP.set_lower_bound` and `JuMP.set_upper_bound` +functions. These can also be used to modify an existing variable bound. 
For +example: +```jldoctest; setup=:(model=Model()) +julia> @variable(model, x >= 1) +x + +julia> JuMP.lower_bound(x) +1.0 + +julia> JuMP.set_lower_bound(x, 2) + +julia> JuMP.lower_bound(x) +2.0 +``` + +Finally, we can delete variable bounds using `JuMP.delete_lower_bound` and +`JuMP.delete_upper_bound`: +```jldoctest; setup=:(model=Model()) +julia> @variable(model, 1 <= x <= 2) +x + +julia> JuMP.lower_bound(x) +1.0 + +julia> JuMP.delete_lower_bound(x) + +julia> JuMP.has_lower_bound(x) +false + +julia> JuMP.upper_bound(x) +2.0 + +julia> JuMP.delete_upper_bound(x) + +julia> JuMP.has_upper_bound(x) +false +``` ## Variable containers @@ -187,7 +225,7 @@ We have already seen the creation of an array of JuMP variables with the arrays of JuMP variables. For example: ```jldoctest variables_arrays; setup=:(model=Model()) julia> @variable(model, x[1:2, 1:2]) -2×2 Array{JuMP.VariableRef,2}: +2×2 Array{VariableRef,2}: x[1,1] x[1,2] x[2,1] x[2,2] ``` @@ -198,7 +236,7 @@ julia> x[1, 2] x[1,2] julia> x[2, :] -2-element Array{JuMP.VariableRef,1}: +2-element Array{VariableRef,1}: x[2,1] x[2,2] ``` @@ -206,7 +244,7 @@ julia> x[2, :] We can also name each index, and variable bounds can depend upon the indices: ```jldoctest; setup=:(model=Model()) julia> @variable(model, x[i=1:2, j=1:2] >= 2i + j) -2×2 Array{JuMP.VariableRef,2}: +2×2 Array{VariableRef,2}: x[1,1] x[1,2] x[2,1] x[2,2] @@ -230,10 +268,10 @@ difference is that instead of returning an `Array` of JuMP variables, JuMP will return a `JuMPArray`. 
For example: ```jldoctest variables_jump_arrays; setup=:(model=Model()) julia> @variable(model, x[1:2, [:A,:B]]) -2-dimensional JuMPArray{JuMP.VariableRef,2,...} with index sets: +2-dimensional JuMPArray{VariableRef,2,...} with index sets: Dimension 1, 1:2 Dimension 2, Symbol[:A, :B] -And data, a 2×2 Array{JuMP.VariableRef,2}: +And data, a 2×2 Array{VariableRef,2}: x[1,A] x[1,B] x[2,A] x[2,B] ``` @@ -244,9 +282,9 @@ julia> x[1, :A] x[1,A] julia> x[2, :] -1-dimensional JuMPArray{JuMP.VariableRef,1,...} with index sets: +1-dimensional JuMPArray{VariableRef,1,...} with index sets: Dimension 1, Symbol[:A, :B] -And data, a 2-element Array{JuMP.VariableRef,1}: +And data, a 2-element Array{VariableRef,1}: x[2,A] x[2,B] ``` @@ -255,10 +293,10 @@ Similarly to the `Array` case, the indices in a `JuMPArray` can be named, and the bounds can depend upon these names. For example: ```jldoctest; setup=:(model=Model()) julia> @variable(model, x[i=2:3, j=1:2:3] >= 0.5i + j) -2-dimensional JuMPArray{JuMP.VariableRef,2,...} with index sets: +2-dimensional JuMPArray{VariableRef,2,...} with index sets: Dimension 1, 2:3 Dimension 2, 1:2:3 -And data, a 2×2 Array{JuMP.VariableRef,2}: +And data, a 2×2 Array{VariableRef,2}: x[2,1] x[2,3] x[3,1] x[3,3] @@ -279,7 +317,7 @@ rectangular set. One example is when indices have a dependence upon previous indices (called *triangular indexing*). JuMP supports this as follows: ```jldoctest; setup=:(model=Model()) julia> @variable(model, x[i=1:2, j=i:2]) -Dict{Any,JuMP.VariableRef} with 3 entries: +Dict{Any,VariableRef} with 3 entries: (1, 2) => x[1,2] (2, 2) => x[2,2] (1, 1) => x[1,1] @@ -291,7 +329,7 @@ sytax appends a comparison check that depends upon the named indices and is separated from the indices by a semi-colon (`;`). 
For example: ```jldoctest; setup=:(model=Model()) julia> @variable(model, x[i=1:4; mod(i, 2)==0]) -Dict{Any,JuMP.VariableRef} with 2 entries: +Dict{Any,VariableRef} with 2 entries: 4 => x[4] 2 => x[2] ``` @@ -308,9 +346,9 @@ julia> A = 1:2 1:2 julia> @variable(model, x[A]) -1-dimensional JuMPArray{JuMP.VariableRef,1,...} with index sets: +1-dimensional JuMPArray{VariableRef,1,...} with index sets: Dimension 1, 1:2 -And data, a 2-element Array{JuMP.VariableRef,1}: +And data, a 2-element Array{VariableRef,1}: x[1] x[2] ``` @@ -323,7 +361,7 @@ We can share our knowledge that it is possible to store these JuMP variables as an array by setting the `container` keyword: ```jldoctest variable_force_container julia> @variable(model, y[A], container=Array) -2-element Array{JuMP.VariableRef,1}: +2-element Array{VariableRef,1}: y[1] y[2] ``` @@ -387,7 +425,7 @@ matrix ``X`` is positive semidefinite if all eigenvalues are nonnegative. We can declare a matrix of JuMP variables to be positive semidefinite as follows: ```jldoctest; setup=:(model=Model()) julia> @variable(model, x[1:2, 1:2], PSD) -2×2 Symmetric{JuMP.VariableRef,Array{JuMP.VariableRef,2}}: +2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}: x[1,1] x[1,2] x[1,2] x[2,2] ``` @@ -400,7 +438,7 @@ You can also impose a slightly weaker constraint that the square matrix is only symmetric (instead of positive semidefinite) as follows: ```jldoctest; setup=:(model=Model()) julia> @variable(model, x[1:2, 1:2], Symmetric) -2×2 Symmetric{JuMP.VariableRef,Array{JuMP.VariableRef,2}}: +2×2 LinearAlgebra.Symmetric{VariableRef,Array{VariableRef,2}}: x[1,1] x[1,2] x[1,2] x[2,2] ``` @@ -424,7 +462,7 @@ x An `Array` of anonymous JuMP variables can be created as follows: ```jldoctest; setup=:(model=Model()) julia> y = @variable(model, [i=1:2]) -2-element Array{JuMP.VariableRef,1}: +2-element Array{VariableRef,1}: noname noname ``` @@ -441,7 +479,7 @@ use the `binary` and `integer` keywords. 
Thus, the anonymous variant of `@variable(model, x[i=1:2] >= i, Int)` is: ```jldoctest; setup=:(model=Model()) julia> x = @variable(model, [i=1:2], basename="x", lower_bound=i, integer=true) -2-element Array{JuMP.VariableRef,1}: +2-element Array{VariableRef,1}: x[1] x[2] ``` @@ -454,26 +492,25 @@ containers. However, users are also free to create collections of JuMP variables in their own datastructures. For example, the following code creates a dictionary with symmetric matrices as the values: ```jldoctest; setup=:(model=Model()) -julia> variables = Dict{Symbol, Symmetric{JuMP.VariableRef, - Array{JuMP.VariableRef,2}}}() -Dict{Symbol,Symmetric{JuMP.VariableRef,Array{JuMP.VariableRef,2}}} with 0 entries +julia> variables = Dict{Symbol, Array{VariableRef,2}}() +Dict{Symbol,Array{VariableRef,2}} with 0 entries julia> for key in [:A, :B] - variables[key] = @variable(model, [1:2, 1:2], Symmetric) + global variables[key] = @variable(model, [1:2, 1:2]) end julia> variables -Dict{Symbol,Symmetric{JuMP.VariableRef,Array{JuMP.VariableRef,2}}} with 2 entries: - :A => JuMP.VariableRef[noname noname; noname noname] - :B => JuMP.VariableRef[noname noname; noname noname] +Dict{Symbol,Array{VariableRef,2}} with 2 entries: + :A => VariableRef[noname noname; noname noname] + :B => VariableRef[noname noname; noname noname] ``` ## Deleting variables -JuMP supports the deletion of optimization variables. To delete variables, we +JuMP supports the deletion of optimization variables. To delete variables, we can use the `JuMP.delete` method. We can also check whether `x` is a valid JuMP variable in `model` using the `JuMP.is_valid` method: -```jldoctest variables_delete +```jldoctest variables_delete; setup=:(model=Model()) julia> @variable(model, x) x