DiffOpt.jl


DiffOpt is a package for differentiating convex optimization programs with respect to their program parameters. It currently supports linear, quadratic, and conic programs. Refer to the documentation for examples. Powered by JuMP.jl, DiffOpt lets you build a differentiable optimization model on top of many existing optimizers.
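Conceptually, DiffOpt treats the optimal solution as an implicit function of the program data and differentiates through the optimality conditions. As a simplified sketch (the documentation gives the exact supported problem forms), let x*(θ) denote the optimal primal solution for parameters θ; reverse mode, used in the example below, takes a seed x̄ on the solution and returns the matching sensitivities θ̄ of the program data:

\[
x^\star(\theta) = \operatorname*{arg\,min}_x \{\, f(x;\theta) : g(x;\theta) \le 0 \,\},
\qquad
\bar{\theta} = \Big(\tfrac{\partial x^\star}{\partial \theta}\Big)^{\top} \bar{x}.
\]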

Installation

DiffOpt can be installed via the Julia package manager:

(v1.3) pkg> add https://github.com/jump-dev/DiffOpt.jl

Example

1. Create a model using the diff_optimizer wrapper.
using JuMP, DiffOpt, Clp

model = JuMP.Model(() -> diff_optimizer(Clp.Optimizer))
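Here diff_optimizer wraps the underlying solver (Clp) in a differentiable layer. Since DiffOpt works with many existing optimizers, other MOI-compatible solvers suited to the problem class should be substitutable; a hypothetical alternative, assuming OSQP.jl is installed (OSQP is not part of this example):

using OSQP  # hypothetical alternative backend
model = JuMP.Model(() -> diff_optimizer(OSQP.Optimizer))  # e.g. for quadratic programs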
2. Define your model and solve it in a single line.
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)

optimize!(model)  # solve
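The solve can be sanity-checked before differentiating; a minimal sketch (the minimum of 2x subject to x >= 3 is attained at x* = 3, with objective value 6):

@assert termination_status(model) == MOI.OPTIMAL
@assert isapprox(value(x), 3.0; atol = 1e-6)               # optimal solution x* = 3
@assert isapprox(objective_value(model), 6.0; atol = 1e-6) # objective 2 * 3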
3. Choose the problem parameters to differentiate with respect to and set their perturbations.
MOI.set.(  # set perturbations / gradient inputs
    model,
    DiffOpt.BackwardInVariablePrimal(),
    x,
    1.0,
)
4. Differentiate the model (specifically, the primal and dual variables) and fetch the gradients.
DiffOpt.backward(model)  # differentiate

grad_exp = MOI.get(  # -3x + 1
    model,
    DiffOpt.BackwardOutConstraint(),
    cons,
)
JuMP.constant(grad_exp)        # 1
JuMP.coefficient(grad_exp, x)  # -3
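These values can be checked by hand. Writing the constraint as a·x >= b with a = 1 and b = 3, the optimum of min 2x is x*(a, b) = b / a, so ∂x*/∂a = -b / a² = -3 and ∂x*/∂b = 1 / a = 1, matching the coefficient and the constant above (assuming this reading of DiffOpt's sign convention for the constant term):

@assert isapprox(JuMP.coefficient(grad_exp, x), -3.0; atol = 1e-6)  # ∂x*/∂a
@assert isapprox(JuMP.constant(grad_exp), 1.0; atol = 1e-6)         # ∂x*/∂b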

