
Quadratic constraints #102

Draft · wants to merge 16 commits into main

Conversation

@github-actions bot commented May 6, 2022

Package name | latest | stable
NLSProblems.jl
OptimizationProblems.jl

@codecov bot commented May 6, 2022

Codecov Report

Merging #102 (3d7001c) into main (bb093af) will increase coverage by 0.07%.
The diff coverage is 96.21%.

❗ Current head 3d7001c differs from pull request most recent head e141a9c. Consider uploading reports for the commit e141a9c to get more accurate results

@@            Coverage Diff             @@
##             main     #102      +/-   ##
==========================================
+ Coverage   95.00%   95.07%   +0.07%     
==========================================
  Files           3        3              
  Lines         580      690     +110     
==========================================
+ Hits          551      656     +105     
- Misses         29       34       +5     
Impacted Files Coverage Δ
src/utils.jl 97.23% <93.24%> (-1.67%) ⬇️
src/moi_nlp_model.jl 100.00% <100.00%> (ø)
src/moi_nls_model.jl 88.36% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@tmigot (Member) commented May 6, 2022

Hi @amontoison, I made some progress with this one. A list of things that we can discuss:

Right now my test is very basic: it just checks that every entry point of the API returns something.

using NLPModels, NLPModelsJuMP

jump = hs61()  # JuMP model of problem HS61 (e.g., from the test problems)
nlp = MathOptNLPModel(jump)
x1 = rand(nlp.meta.nvar)

obj(nlp, x1)
cons(nlp, x1)
jac_structure(nlp)
jac_coord(nlp, x1)
hess(nlp, x1)
hess(nlp, x1, rand(nlp.meta.ncon))  # Lagrangian Hessian with random multipliers
hprod(nlp, x1, x1)                  # Hessian-vector product with v = x1


@tmigot (Member) commented May 6, 2022

We can also use the following script to test problems with quadratic constraints from OptimizationProblems.jl:

using ADNLPModels, OptimizationProblems, NLPModels

"""
    test_quadratic_constraints(sample_size = 10)

Return the list of problems from OptimizationProblems.jl whose nonlinear
constraints appear to be quadratic, i.e., whose constraint Hessians are
constant over `sample_size` random points within the variable bounds.
"""
function test_quadratic_constraints(sample_size = 10)
  meta = OptimizationProblems.meta[!, :]
  con_pb = meta[meta.ncon .> 0, :name]
  sample_size = max(sample_size, 2)

  list = []
  for pb in con_pb
    nlp = OptimizationProblems.ADNLPProblems.eval(Symbol(pb))()
    std = similar(nlp.meta.x0)
    blvar = similar(nlp.meta.lvar)
    buvar = similar(nlp.meta.uvar)
    for j = 1:nlp.meta.nvar
      # replace infinite bounds by ±10 to get a bounded sampling box
      blvar[j] = nlp.meta.lvar[j] == -Inf ? -10.0 : nlp.meta.lvar[j]
      buvar[j] = nlp.meta.uvar[j] == Inf ? 10.0 : nlp.meta.uvar[j]
      std[j] = max(abs(blvar[j]), abs(buvar[j]))
    end
    Iref = collect(1:nlp.meta.ncon)
    for k = 1:nlp.meta.ncon
      if k in nlp.meta.lin # skip linear constraints
        Iref[k] = -1
        continue
      end
      y0 = zeros(nlp.meta.ncon)
      y0[k] = 1.0
      # Hessian of the k-th constraint at the starting point
      ref = hess(nlp, nlp.meta.x0, y0, obj_weight = 0.0)
      for i = 1:sample_size
        x = min.(max.((2 * rand(nlp.meta.nvar) .- 1) .* std, blvar), buvar)
        Hx = hess(nlp, x, y0, obj_weight = 0.0)
        if Hx != ref # the Hessian varies, so constraint k is not quadratic
          Iref[k] = -1
          break
        end
      end
    end
    quad = findall(x -> x > 0, Iref)
    if !isempty(quad)
      push!(list, (nlp.meta.name, Iref[quad]))
    end
  end
  return list
end

test_quadratic_constraints()


@tmigot (Member) commented May 11, 2022

@amontoison I am still checking against the other problems in OptimizationProblems.jl, but it looks good to me.


@tmigot (Member) commented May 12, 2022

I am done testing OptimizationProblems.jl; I added 45 problems with quadratic constraints (https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/tree/add-quadratic) and found no errors. So it's good for me. Up to you.

@amontoison changed the title from "[WIP] Quadratic constraints" to "Quadratic constraints" on May 12, 2022
@amontoison (Member, Author) left a comment:
Major modifications of NLPModels.hess_coord! are required.

test/nlp_problems/hs100.jl (outdated; resolved)
for j=1:length(qcon.vec)
vals[k + j] = qcon.b[qcon.vec[j]]
end
nnzj = length(qcon.hessian.vals)
@amontoison (Member, Author):
nnzj should be the number of nonzeros of Qᵢx + bᵢ (length(set)), not of Qᵢ.
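To make the count concrete, here is a minimal sketch (hypothetical data, not the model's internal storage) of why the Jacobian row of a quadratic constraint cᵢ(x) = ½xᵀQᵢx + bᵢᵀx has length(set) nonzeros, where set is the union of the supports of Qᵢx and bᵢ:

```julia
using SparseArrays

# Hypothetical quadratic constraint c(x) = ½ xᵀQx + bᵀx in 3 variables
Q = sparse([1, 2], [1, 2], [2.0, 4.0], 3, 3)  # nnz(Q) = 2
b = [0.0, 0.0, 1.0]

# ∇c(x) = Qx + b, so the Jacobian row involves the variables in the
# union of Q's row support and b's support
set = union(unique(SparseArrays.rowvals(Q)), findall(!iszero, b))
length(set)  # 3 nonzeros in the Jacobian row, while nnz(Q) is only 2
```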

src/moi_nlp_model.jl (outdated; resolved)
src/moi_nlp_model.jl (resolved)
for i = 1:(nlp.quadcon.nquad)
qcon = nlp.quadcon[i]
nnzh = length(qcon.hessian.vals)
vals[(k + 1):(k + nnzh)] .= qcon.hessian.vals .* y[nlp.meta.nlin + i]
@amontoison (Member, Author):
We want to avoid that with the hessian_quad function!
If we have 10 quadratic constraints Qᵢ with the same structure, you will store 10 times the required number of nonzeros for nothing.
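The saving can be sketched as follows (illustrative data; the actual hessian_quad implementation may differ): constraints sharing a sparsity pattern only need that pattern stored once, with the values combined per multiplier:

```julia
using SparseArrays

# Two quadratic constraints with the same sparsity pattern (illustrative)
Q1 = sparse([1, 2], [1, 2], [1.0, 2.0], 2, 2)
Q2 = sparse([1, 2], [1, 2], [3.0, 4.0], 2, 2)
y = [0.5, 2.0]  # multipliers

# Naive storage keeps nnz(Q1) + nnz(Q2) = 4 entries for the Lagrangian
# Hessian; a shared structure keeps one pattern and accumulates Σᵢ yᵢ Qᵢ.
H = y[1] * Q1 + y[2] * Q2
nnz(H)  # 2 entries instead of 4
```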

@@ -354,6 +394,7 @@ function NLPModels.hess_coord!(
vals[(nlp.obj.nnzh + 1):(nlp.meta.nnzh)] .= 0.0
end
if nlp.obj.type == "NONLINEAR"
vals .= 0.0
@amontoison (Member, Author):

MOI.eval_hessian_lagrangian doesn't overwrite vals?

@amontoison (Member, Author):

If vals .= 0.0 is required, we should also add it in the other NLPModels.hess_coord! function.
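The pitfall is generic: when an output buffer is reused across calls and contributions are accumulated with .+=, stale values leak in unless the buffer is reset first. A minimal sketch (generic buffer, not the actual hess_coord! code):

```julia
vals = ones(3)             # stale values left over from a previous call
contrib = [1.0, 2.0, 3.0]  # this call's Hessian contributions

vals .+= contrib           # wrong: stale data leaks in, vals == [2.0, 3.0, 4.0]

vals .= 0.0                # reset first ...
vals .+= contrib           # ... then accumulate: vals == [1.0, 2.0, 3.0]
```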

src/utils.jl (outdated; resolved)
src/moi_nlp_model.jl (outdated; resolved)
src/moi_nlp_model.jl (outdated; resolved)
@tmigot tmigot self-requested a review May 12, 2022 22:59

2 participants