Alpine, a global solver for nonconvex MINLPs

"ALPINE: glob(AL) o(P)timization for mixed-(I)nteger programs with (N)onlinear (E)quations", is a novel global optimization solver that uses an adaptive, piecewise convexification scheme and constraint programming methods to solve non-convex Mixed-Integer Non-Linear Programs (MINLPs) efficiently. MINLPs are typically "hard" optimization problems which appear in numerous applications (see MINLPLib.jl).

Alpine is built entirely on JuMP and MathOptInterface in Julia, which provides considerable flexibility for usage and further development.

Alpine globally solves a given MINLP by:

  • Analyzing the problem's expressions (objective and constraints) and applying appropriate convex relaxations and polyhedral outer-approximations

  • Performing sequential optimization-based bound tightening and an iterative, adaptive partitioning scheme that refines piecewise polyhedral relaxations, with a guarantee of global convergence (both steps can be tuned via solver attributes; see the sketch after this list)
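
Both of these steps can be tuned through solver attributes once Alpine is attached to a JuMP model. The snippet below is a minimal sketch: the option names "presolve_bt" (optimization-based bound tightening) and "rel_gap" (relative optimality gap at which the partitioning loop stops) are assumptions based on Alpine's documented options, so check the documentation of your installed version for the authoritative names and defaults.

using JuMP, Alpine

# Sub-solvers ("nlp_solver", "mip_solver") still need to be attached before
# calling optimize!; see the usage example in the next section.
model = Model(Alpine.Optimizer)
set_optimizer_attribute(model, "presolve_bt", true)  # sequential bound tightening in presolve
set_optimizer_attribute(model, "rel_gap", 1e-4)      # stop partitioning at a 0.01% relative gap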

Allowable nonlinearities: Alpine can currently handle MINLPs with polynomials in the constraints and/or the objective. There is currently no support for exponential cones or Positive Semi-Definite (PSD) cones. Alpine is also a good fit for subclasses of MINLPs, e.g., Mixed-Integer Quadratically-Constrained Quadratic Programs (MIQCQPs), Nonlinear Programs (NLPs), etc.

For more details, check out this video on Alpine.jl at the 2nd Annual JuMP-dev Workshop, held at the Institut de Mathématiques de Bordeaux, June 2018.

Installation and Usage

Alpine can be installed through the Julia package manager:

julia> import Pkg; Pkg.add("Alpine")

Developers: Any further development of Alpine can be conducted on a new branch or a forked repo.

Check the "test/examples" folder on how to use this package.

Underlying solvers

Though the algorithm implemented in Alpine is quite involved, the main computational bottleneck lies in the underlying MIP solvers. Since every iteration of Alpine solves a subproblem (typically a convex MILP/MIQCQP) to optimality, Alpine's run time depends heavily on the run time of these solvers. For the best performance, use commercial solvers such as CPLEX or Gurobi. Thanks to the flexibility offered by JuMP, the following solvers are supported in Alpine (a configuration sketch follows the table):

Solver          Julia Package
CPLEX           CPLEX.jl
Cbc             Cbc.jl
Gurobi          Gurobi.jl
Ipopt           Ipopt.jl
Bonmin          Bonmin.jl
Artelys KNITRO  KNITRO.jl
Xpress          Xpress.jl
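
For example, swapping the MIP sub-solver for Gurobi (assuming a Gurobi license and Gurobi.jl are installed) is only a matter of changing the value passed to the "mip_solver" attribute; the sketch below also uses Gurobi's "OutputFlag" parameter to silence its log.

using JuMP, Alpine, Ipopt, Gurobi

gurobi = optimizer_with_attributes(Gurobi.Optimizer, "OutputFlag" => 0)
ipopt  = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)

alpine = optimizer_with_attributes(
    Alpine.Optimizer,
    "nlp_solver" => ipopt,
    "mip_solver" => gurobi,
)
model = Model(alpine)   # then build and solve the MINLP as in the usage example above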

Bug reports and support

Please report any issues via the GitHub issue tracker. All types of issues are welcome and encouraged, including bug reports, documentation typos, and feature requests.

Challenging Problems

We are seeking hard benchmark instances for MINLPs. If you would like to share a hard instance, please get in touch, either by opening an issue or privately.

Citing Alpine

If you find Alpine useful in your work, we kindly request that you cite the following papers:

@article{NagarajanLuWangBentSundar2019,
  author = {Nagarajan, Harsha and Lu, Mowen and Wang, Site and Bent, Russell and Sundar, Kaarthik},
  title = {An adaptive, multivariate partitioning algorithm for global optimization of nonconvex programs},
  journal = {Journal of Global Optimization},
  year = {2019},
  issn = {1573-2916},
  doi = {10.1007/s10898-018-00734-1},
}

@inproceedings{NagarajanLuYamangilBent2016,
  title = {Tightening {McC}ormick relaxations for nonlinear programs via dynamic multivariate partitioning},
  author = {Nagarajan, Harsha and Lu, Mowen and Yamangil, Emre and Bent, Russell},
  booktitle = {International Conference on Principles and Practice of Constraint Programming},
  pages = {369--387},
  year = {2016},
  organization = {Springer},
  doi = {10.1007/978-3-319-44953-1_24},
}
