Merge pull request #66 from jump-dev/add-gradient-cache
add gradient cache in Optimizer
matbesancon committed Feb 16, 2021
2 parents: 0a8781e + 7941882 · commit: f2be16c
Showing 5 changed files with 359 additions and 86 deletions.
examples/solve-QP.jl (4 changes: 2 additions & 2 deletions)
@@ -5,7 +5,7 @@ using MathOptInterface
const MOI = MathOptInterface
const MOIU = MathOptInterface.Utilities;

-using OSQP
+using Ipopt

n = 20 # variable dimension
m = 15 # no of inequality constraints
@@ -17,7 +17,7 @@ q = rand(n)
G = rand(m, n)
h = G * x̂ + rand(m);

-model = MOI.instantiate(OSQP.Optimizer, with_bridge_type=Float64)
+model = MOI.instantiate(Ipopt.Optimizer, with_bridge_type=Float64)
x = MOI.add_variables(model, n);

# define objective
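The visible hunk ends at the "# define objective" comment, so only the solver swap from OSQP to Ipopt is shown. As a rough sketch of how a MathOptInterface QP example like this one typically continues (not taken from this commit: Q stands in for the positive semidefinite matrix built in the collapsed lines, and the ScalarQuadraticFunction argument order assumes MOI 0.9, the release current in February 2021), the objective 1/2 x'Qx + q'x and the constraints G*x <= h could be added like this:

# Quadratic objective: minimize 1/2 x'Qx + q'x.
# Q: an n-by-n PSD matrix, assumed to be defined in the collapsed part of the example.
# Upper-triangle terms encode the symmetric Q under MOI's 1/2 x'Qx convention.
quad_terms = MOI.ScalarQuadraticTerm{Float64}[]
for i in 1:n
    for j in i:n
        push!(quad_terms, MOI.ScalarQuadraticTerm(Q[i, j], x[i], x[j]))
    end
end
obj = MOI.ScalarQuadraticFunction(
    MOI.ScalarAffineTerm.(q, x),  # affine part q'x
    quad_terms,                   # quadratic part (MOI 0.9 argument order; later MOI releases reversed it)
    0.0,
)
MOI.set(model, MOI.ObjectiveFunction{MOI.ScalarQuadraticFunction{Float64}}(), obj)
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)

# Inequality constraints G*x <= h, added one row at a time.
for i in 1:m
    MOI.add_constraint(
        model,
        MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(G[i, :], x), 0.0),
        MOI.LessThan(h[i]),
    )
end

MOI.optimize!(model)
x_sol = MOI.get(model, MOI.VariablePrimal(), x)  # primal solution vector

Because the model is instantiated with with_bridge_type=Float64, constraint types the solver does not support natively are bridged automatically, which is what makes swapping OSQP for Ipopt in this example a one-line change.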
