update README and documentation in preparation for JuMP/MOI merge
mlubin committed Nov 24, 2017
1 parent 366a6e0 commit f95e903
Showing 11 changed files with 15 additions and 1,263 deletions.
16 changes: 8 additions & 8 deletions README.md
@@ -37,19 +37,21 @@ Our documentation includes an installation guide, quick-start guide, and referen

[juliaopt-notebooks]: https://github.com/JuliaOpt/juliaopt-notebooks

**Read about the upcoming transition to MathOptInterface and breaking changes in JuMP 0.19 [here](https://discourse.julialang.org/t/mathoptinterface-and-upcoming-breaking-changes-in-jump-0-19).**

**Latest Release**: 0.18.0 (via ``Pkg.add``)
**Latest Release**: 0.18.0 (`release-0.18` branch)
* [Documentation](http://www.juliaopt.org/JuMP.jl/0.18/)
* [Examples](https://github.com/JuliaOpt/JuMP.jl/tree/release-0.18/examples)
* Testing status:
* TravisCI: [![Build Status](https://travis-ci.org/JuliaOpt/JuMP.jl.svg?branch=release-0.18)](https://travis-ci.org/JuliaOpt/JuMP.jl)
* PackageEvaluator:
[![JuMP](http://pkg.julialang.org/badges/JuMP_0.5.svg)](http://pkg.julialang.org/?pkg=JuMP&ver=0.5)
[![JuMP](http://pkg.julialang.org/badges/JuMP_0.6.svg)](http://pkg.julialang.org/?pkg=JuMP&ver=0.6)


**Development version**:
**Read about the upcoming transition to MathOptInterface (MOI) and breaking changes in JuMP 0.19 [here](https://discourse.julialang.org/t/mathoptinterface-and-upcoming-breaking-changes-in-jump-0-19).**

The `master` branch now includes these breaking changes, but **we do not yet recommend using this branch unless you are a JuMP developer or solver developer**.

**Development version** (`master` branch):
* [Documentation](http://www.juliaopt.org/JuMP.jl/latest/)
* [Examples](https://github.com/JuliaOpt/JuMP.jl/tree/master/examples)
* Testing status:
@@ -109,7 +111,7 @@ Please report any issues via the Github **[issue tracker]**. All types of issues

## Citing JuMP

If you find JuMP useful in your work, we kindly request that you cite the following [paper](http://dx.doi.org/10.1137/15M1020575):
If you find JuMP useful in your work, we kindly request that you cite the following paper ([pdf](https://mlubin.github.io/pdf/jump-sirev.pdf)):

@article{DunningHuchetteLubin2017,
author = {Iain Dunning and Joey Huchette and Miles Lubin},
@@ -122,8 +124,6 @@ If you find JuMP useful in your work, we kindly request that you cite the follow
doi = {10.1137/15M1020575},
}

A preprint of this paper is freely available on [arXiv](https://arxiv.org/abs/1508.01982).

For an earlier work where we presented a prototype implementation of JuMP, see [here](http://dx.doi.org/10.1287/ijoc.2014.0623):

@article{LubinDunningIJOC,
@@ -137,4 +137,4 @@ For an earlier work where we presented a prototype implementation of JuMP, see [
doi = {10.1287/ijoc.2014.0623},
}

A preprint of this paper is also [freely available](http://arxiv.org/abs/1312.1431).
A preprint of this paper is [freely available](http://arxiv.org/abs/1312.1431).
6 changes: 1 addition & 5 deletions docs/make.jl
@@ -9,11 +9,7 @@ makedocs(
"Introduction" => "index.md",
"Installation Guide" => "installation.md",
"Quick Start Guide" => "quickstart.md",
"Models" => "refmodel.md",
"Variables" => "refvariable.md",
"Expressions and Constraints" => "refexpr.md",
"Problem Modification" => "probmod.md",
"Solver Callbacks" => "callbacks.md",
"Nonlinear Modeling" => "nlp.md"
]
)
@@ -25,4 +21,4 @@ deploydocs(
julia = "0.6",
deps = nothing,
make = nothing
)
)
392 changes: 0 additions & 392 deletions docs/src/callbacks.md

This file was deleted.

31 changes: 6 additions & 25 deletions docs/src/index.md
@@ -1,6 +1,9 @@
JuMP --- Julia for Mathematical Optimization
============================================

!!! warning
This documentation is for the development version of JuMP. JuMP is undergoing a [major transition](https://discourse.julialang.org/t/mathoptinterface-and-upcoming-breaking-changes-in-jump-0-19) to MathOptInterface, and the documentation has not yet been rewritten. We **do not** recommend using the development version unless you are a JuMP or solver developer.

[JuMP](https://github.com/JuliaOpt/JuMP.jl) is a domain-specific modeling language for [mathematical optimization](http://en.wikipedia.org/wiki/Mathematical_optimization) embedded in [Julia](http://julialang.org/). It currently supports a number of open-source and commercial solvers (see below) for a variety of problem classes, including **linear programming**, **mixed-integer programming**, **second-order conic programming**, **semidefinite programming**, and **nonlinear programming**. JuMP's features include:

- User friendliness
@@ -13,7 +16,7 @@ JuMP --- Julia for Mathematical Optimization
- JuMP uses a generic solver-independent interface provided by the [MathProgBase](https://github.com/mlubin/MathProgBase.jl) package, making it easy to change between a number of open-source and commercial optimization software packages ("solvers").
- Currently supported solvers include [Artelys Knitro](http://artelys.com/en/optimization-tools/knitro), [Bonmin](https://projects.coin-or.org/Bonmin), [Cbc](https://projects.coin-or.org/Cbc), [Clp](https://projects.coin-or.org/Clp), [Couenne](https://projects.coin-or.org/Couenne), [CPLEX](http://www-01.ibm.com/software/commerce/optimization/cplex-optimizer/), [ECOS](https://github.com/ifa-ethz/ecos), [FICO Xpress](http://www.fico.com/en/products/fico-xpress-optimization-suite), [GLPK](http://www.gnu.org/software/glpk/), [Gurobi](http://www.gurobi.com), [Ipopt](https://projects.coin-or.org/Ipopt), [MOSEK](http://www.mosek.com/), [NLopt](http://ab-initio.mit.edu/wiki/index.php/NLopt), and [SCS](https://github.com/cvxgrp/scs).
- Access to advanced algorithmic techniques
- Including efficient LP re-solves <probmod> and callbacks for mixed-integer programming <callbacks> which previously required using solver-specific and/or low-level C++ libraries.
- Including efficient LP re-solves which previously required using solver-specific and/or low-level C++ libraries.
- Ease of embedding
- JuMP itself is written purely in Julia. Solvers are the only binary dependencies.
- Being embedded in a general-purpose programming language makes it easy to solve optimization problems as part of a larger workflow (e.g., inside a simulation, behind a web server, or as a subproblem in a decomposition algorithm).
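
As a rough sketch of the solver-independence described in the feature list above, a JuMP model is tied to a backend only through the `solver` argument of `Model` (this example assumes the JuMP 0.18 / MathProgBase-era API with the Clp and GLPKMathProgInterface packages installed):

```julia
using JuMP, Clp   # Clp.jl exports ClpSolver via MathProgBase

# A small LP; nothing below is specific to Clp except the solver argument.
m = Model(solver=ClpSolver())
@variable(m, 0 <= x <= 2)
@variable(m, 0 <= y <= 30)
@objective(m, Max, 5x + 3y)
@constraint(m, x + 5y <= 3)

solve(m)
println("Objective: ", getobjectivevalue(m), ", x = ", getvalue(x))

# Switching solvers only means constructing the model with a different
# solver object, e.g. Model(solver=GLPKSolverLP()) from GLPKMathProgInterface.
```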
@@ -22,41 +25,21 @@ JuMP --- Julia for Mathematical Optimization

While neither Julia nor JuMP has reached version 1.0 yet, the releases are stable enough for everyday use and are being used in a number of research projects and neat applications by a growing community of users who are early adopters. JuMP remains under active development, and we welcome your feedback, suggestions, and bug reports.

Installing JuMP
---------------

If you are familiar with Julia you can get started quickly by using the package manager to install JuMP:

```julia
julia> Pkg.add("JuMP")
```

And a solver, e.g.:

```julia
julia> Pkg.add("Clp") # Will install Cbc as well
```

Then read the quick-start guide and/or see a simple example. The subsequent sections detail the complete functionality of JuMP.

Contents
--------

```@contents
Pages = ["installation.md",
"quickstart.md",
"refmodel.md",
"refvariable.md",
"refexpr.md",
"probmod.md",
"callbacks.md",
"nlp.md"]
Depth = 2
```

### Citing JuMP

If you find JuMP useful in your work, we kindly request that you cite the following [paper](http://dx.doi.org/10.1137/15M1020575):
If you find JuMP useful in your work, we kindly request that you cite the following paper ([pdf](https://mlubin.github.io/pdf/jump-sirev.pdf)):

``` sourceCode
@article{DunningHuchetteLubin2017,
@@ -71,8 +54,6 @@ doi = {10.1137/15M1020575},
}
```

A preprint of this paper is freely available on [arXiv](http://arxiv.org/abs/1508.01982).

For an earlier work where we presented a prototype implementation of JuMP, see [here](http://dx.doi.org/10.1287/ijoc.2014.0623):

``` sourceCode
@@ -88,4 +69,4 @@ doi = {10.1287/ijoc.2014.0623},
}
```

A preprint of this paper is also [freely available](http://arxiv.org/abs/1312.1431).
A preprint of this paper is [freely available](http://arxiv.org/abs/1312.1431).
36 changes: 0 additions & 36 deletions docs/src/installation.md
@@ -106,39 +106,3 @@ Requires a working installation of Xpress with an active license (it is possible
!!! warning
If you are using 64-bit Xpress, you must use 64-bit Julia (and similarly with 32-bit Xpress).

### GLPK

GLPK binaries are provided on OS X and Windows (32- and 64-bit) by default. On Linux, it will be compiled from source. Note that `GLPKSolverLP` should be used for continuous problems and `GLPKSolverMIP` for problems with integer variables. GLPK supports MIP callbacks but does not support "SOS" constraints.
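
For instance, picking between the two might look like this (a minimal sketch assuming the GLPKMathProgInterface.jl package, which exported `GLPKSolverLP` and `GLPKSolverMIP` in the MathProgBase era):

```julia
using JuMP, GLPKMathProgInterface

# Continuous problem: use GLPKSolverLP.
lp = Model(solver=GLPKSolverLP())
@variable(lp, x >= 0)
@constraint(lp, 2x >= 3)
@objective(lp, Min, x)
solve(lp)

# Problem with integer variables: use GLPKSolverMIP instead.
mip = Model(solver=GLPKSolverMIP())
@variable(mip, y >= 0, Int)
@constraint(mip, 2y >= 3)
@objective(mip, Min, y)
solve(mip)
```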

### Gurobi

Requires a working installation of Gurobi with an activated license (free for academic use). Gurobi supports MIP callbacks and "SOS" constraints.

!!! warning
If you are using 64-bit Gurobi, you must use 64-bit Julia (and similarly with 32-bit Gurobi).

### Ipopt

Ipopt binaries are provided on OS X and Windows (32- and 64-bit) by default. On Linux, it will be compiled from source. The default installation of Ipopt uses the open-source MUMPS library for sparse linear algebra. Significant speedups can be obtained by manually compiling Ipopt to use proprietary sparse linear algebra libraries instead. Julia can be pointed to use a custom version of Ipopt; we suggest posting to the [julia-opt](https://groups.google.com/forum/#!forum/julia-opt) mailing list with your platform details for guidance on how to do this.

### MOSEK

Requires a license (free for academic use). Mosek does not support the MIP callbacks used in JuMP. For nonlinear optimization, Mosek supports only convex problems. The Mosek interface is maintained by the Mosek team. (Thanks!)

### NLopt

NLopt supports only nonlinear models. An algorithm must be specified as an option when using `NLoptSolver`. NLopt is not recommended for large-scale models, because it does not currently exploit sparsity of derivative matrices.
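
For example (a sketch; `:LD_MMA` is just one of NLopt's gradient-based algorithms, and any other supported algorithm symbol could be passed instead):

```julia
using JuMP, NLopt

# The algorithm must be chosen up front when constructing NLoptSolver.
m = Model(solver=NLoptSolver(algorithm=:LD_MMA))
@variable(m, x >= 0.1)
@variable(m, y >= 0.1)
@NLobjective(m, Min, (x - 1)^2 + (y - 2)^2)
@NLconstraint(m, x * y >= 1)
solve(m)
```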

### SCS

SCS can be used by JuMP to solve LPs, SOCPs, and SDPs. SCS is a first-order solver and has low accuracy (``10^{-4}``) by default; see the SCS.jl documentation for more information.

### COIN-OR Bonmin and Couenne

Binaries of Bonmin and Couenne are provided on OS X and Windows (32- and 64-bit) by the [CoinOptServices.jl](https://github.com/JuliaOpt/CoinOptServices.jl) package. On Linux, they will be compiled from source. Once installed, they can be called either via `.osil` files using `OsilBonminSolver` and `OsilCouenneSolver` from [CoinOptServices.jl](https://github.com/JuliaOpt/CoinOptServices.jl), or via `.nl` files using `BonminNLSolver` and `CouenneNLSolver` from [AmplNLWriter.jl](https://github.com/JackDunnNZ/AmplNLWriter.jl). We recommend using the `.nl` format option, which is currently more stable and has better performance for derivative computations. Since both Bonmin and Couenne use Ipopt for continuous subproblems, the same MUMPS sparse linear algebra performance caveat applies.

### Other AMPL-compatible solvers

Any other solver not listed above that can be called from [AMPL](http://ampl.com/products/solvers/all-solvers-for-ampl/) can be used by JuMP through the [AmplNLWriter.jl](https://github.com/JuliaOpt/AmplNLWriter.jl) package. The first argument to `AmplNLSolver` can be used to specify a solver executable name.

For example, [SCIP](http://scip.zib.de/) is a powerful noncommercial mixed-integer programming solver. To use SCIP within JuMP, you must first download and [compile SCIP with support for AMPL](http://zverovich.net/2012/08/07/using-scip-with-ampl.html). Then you may use `AmplNLSolver("/path/to/scipampl")` where `scipampl` is the executable produced from the compilation process.
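
Putting the pieces together, a hypothetical invocation might look like the following sketch (the `scipampl` path is whatever your SCIP compilation produced; the model itself is arbitrary):

```julia
using JuMP, AmplNLWriter

# AmplNLSolver takes the path to any AMPL-compatible solver executable,
# here a locally compiled SCIP/AMPL binary (hypothetical path).
m = Model(solver=AmplNLSolver("/path/to/scipampl"))
@variable(m, x >= 0, Int)
@constraint(m, 2x >= 3)
@objective(m, Min, x)
solve(m)
```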
77 changes: 0 additions & 77 deletions docs/src/probmod.md
@@ -1,80 +1,3 @@
Problem Modification
====================

It can be useful to modify models after they have been created and solved, for example when we are solving many similar models in succession or generating the model dynamically (e.g. column generation). Additionally it is sometimes desirable for the solver to re-start from the last solution to reduce running times for successive solves ("hot-start"). Where available, JuMP exposes this functionality.

Differences in Solvers
----------------------

Some solvers do not expose the ability to modify a model after creation - the model must be constructed from scratch each time. JuMP will use the ability to modify problems exposed by the solver if possible, and will still work even if the solver does not support this functionality, by passing the complete problem to the solver every time.

Modifying variables
-------------------

As before, variables can be added using the `@variable` macro. To remove a variable, one can set the bounds on that variable to zero, e.g.:

```julia
setlowerbound(x, 0.0)
setupperbound(x, 0.0)
```

While bound updates are applied immediately in JuMP, they are not transmitted to the solver until `solve` is called again.

To add variables that appear in existing constraints, e.g. in column generation, there is an alternative form of the `@variable` macro:

```julia
@variable(m, x, objective = objcoef, inconstraints = constrrefs, coefficients = values)
@variable(m, x >= lb, objective = objcoef, inconstraints = constrrefs, coefficients = values)
@variable(m, x <= ub, objective = objcoef, inconstraints = constrrefs, coefficients = values)
@variable(m, lb <= x <= ub, objective = objcoef, inconstraints = constrrefs, coefficients = values)
@variable(m, lb <= x <= ub, Int, objective = objcoef, inconstraints = constrrefs, coefficients = values) # Types are supported
```

where `objcoef` is the coefficient of the variable in the new problem, `constrrefs` is a vector of `ConstraintRef`, and `values` is a vector of numbers. To give an example, consider the following code snippet:

```julia
m = Model()
@variable(m, 0 <= x <= 1)
@variable(m, 0 <= y <= 1)
@objective(m, Max, 5x + 1y)
@constraint(m, con, x + y <= 1)
solve(m) # x = 1, y = 0
@variable(m, 0 <= z <= 1, objective = 10.0, inconstraints = [con], coefficients = [1.0])
# The constraint is now x + y + z <= 1
# The objective is now 5x + 1y + 10z
solve(m) # z = 1
```

In some situations you may be adding all variables in this way. To do so, first define a set of empty constraints, e.g.:

```julia
m = Model()
@constraint(m, con, 0 <= 1)
@objective(m, Max, 0)
@variable(m, 0 <= x <= 1, objective = 5, inconstraints = [con], coefficients = [1.0])
@variable(m, 0 <= y <= 1, objective = 1, inconstraints = [con], coefficients = [1.0])
@variable(m, 0 <= z <= 1, objective = 10, inconstraints = [con], coefficients = [1.0])
solve(m)
```

Modifying constraints
---------------------

JuMP does not currently support changing constraint coefficients. For less-than and greater-than constraints, the right-hand-side can be changed, e.g.:

```julia
@constraint(m, mycon, x + y <= 4)
solve(m)
JuMP.setRHS(mycon, 3) # Now x + y <= 3
solve(m) # Hot-start for LPs
```

Modifying the objective
-----------------------

To change the objective, simply call `@objective` again - the previous objective function and sense will be replaced.
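
A minimal sketch, assuming a model `m` with variables `x` and `y` already defined:

```julia
@objective(m, Max, 5x + y)   # original objective
solve(m)

# Calling @objective again replaces both the objective function and the sense.
@objective(m, Min, x + 2y)
solve(m)
```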

Modifying nonlinear models
--------------------------

See [Nonlinear Parameters](@ref).
