DiffOpt is a package for differentiating convex optimization programs with respect to the program parameters. It currently supports linear, quadratic, and conic programs. Built on JuMP.jl, DiffOpt can create a differentiable optimization model from many existing optimizers. Refer to the documentation for examples.
DiffOpt can be installed via the Julia package manager:
```julia
(v1.3) pkg> add https://github.com/jump-dev/DiffOpt.jl
```
- Create a model using the wrapper.

```julia
using JuMP, DiffOpt, Clp

model = JuMP.Model(() -> diff_optimizer(Clp.Optimizer))
```
- Define your model and solve it.

```julia
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model) # solve
```
- Choose the problem parameters to differentiate with respect to, and set their perturbations.

```julia
MOI.set.( # set perturbations / gradient inputs
    model,
    DiffOpt.BackwardInVariablePrimal(),
    x,
    1.0,
)
```
- Differentiate the model (specifically, its primal and dual variables) and fetch the gradients.

```julia
DiffOpt.backward(model) # differentiate

grad_exp = MOI.get( # -3x + 1
    model,
    DiffOpt.BackwardOutConstraint(),
    cons,
)
JuMP.constant(grad_exp)        # 1
JuMP.coefficient(grad_exp, x)  # -3
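```

As a sanity check, the variable's sensitivity to the constraint bound can also be approximated by finite differences, re-solving the LP with a perturbed right-hand side. This is a sketch, not part of DiffOpt's API; it assumes Clp is installed as in the quickstart above, and the helper name `solve_lp` is illustrative:

```julia
using JuMP, Clp

# Solve min 2x s.t. x >= b and return the optimal x, for a given bound b.
function solve_lp(b)
    m = Model(Clp.Optimizer)
    set_silent(m)
    @variable(m, x)
    @constraint(m, x >= b)
    @objective(m, Min, 2x)
    optimize!(m)
    return value(x)
end

ε = 1e-4
dxdb = (solve_lp(3 + ε) - solve_lp(3 - ε)) / (2ε)  # ≈ 1.0: the bound is active, so x* = b
```

Because the constraint binds at the optimum, the solution tracks the bound exactly, matching the unit sensitivity that the backward pass propagates.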
DiffOpt began as a NumFOCUS-sponsored Google Summer of Code (2020) project.