Conversation

@tmigot (Member) commented Jun 4, 2021

Follow-up to NLPModels.jl PR #353, which introduced a parametric meta.
Add tests on SimpleNLPModel and SimpleNLSModel for Float64 and Float32.
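A sketch of what such a multi-precision test might look like. `ToyMeta` and its field names are illustrative stand-ins, not the package's actual types; the point is that a meta parametrized by the element type `T` should propagate `T` to every stored vector:

```julia
using Test

# Illustrative stand-in for a meta structure parametrized by the element
# type T, in the spirit of the parametric meta from NLPModels.jl PR #353.
struct ToyMeta{T}
  nvar::Int
  x0::Vector{T}
  lvar::Vector{T}
  uvar::Vector{T}
end

# Convenience constructor: unbounded variables, starting point x0.
ToyMeta(x0::Vector{T}) where {T} =
  ToyMeta(length(x0), x0, fill(T(-Inf), length(x0)), fill(T(Inf), length(x0)))

# The point of the new tests: every stored vector keeps the requested precision.
for T in (Float32, Float64)
  meta = ToyMeta(T[-1.2, 1.0])
  @test eltype(meta.x0) == T
  @test eltype(meta.lvar) == T == eltype(meta.uvar)
end
```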

@geoffroyleconte (Member)

@tmigot I think I registered it here: JuliaSmoothOptimizers/NLPModels.jl@e3b8b73, will this fix your tests?

@tmigot (Member, Author) commented Jun 5, 2021

> @tmigot I think I registered it here: JuliaSmoothOptimizers/NLPModels.jl@e3b8b73, will this fix your tests?

Great, thank you! I will wait for the official release and then rebase.

codecov bot commented Jun 5, 2021

Codecov Report

Merging #12 (21d1f19) into master (27ecdc5) will increase coverage by 0.11%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master      #12      +/-   ##
==========================================
+ Coverage   97.92%   98.03%   +0.11%     
==========================================
  Files           6        6              
  Lines         578      612      +34     
==========================================
+ Hits          566      600      +34     
  Misses         12       12              
Impacted Files Coverage Δ
src/feasibility-form-nls.jl 99.57% <100.00%> (+0.03%) ⬆️
src/feasibility-residual.jl 100.00% <100.00%> (ø)
src/model-interaction.jl 100.00% <100.00%> (ø)
src/quasi-newton.jl 93.33% <100.00%> (ø)
src/slack-model.jl 96.47% <100.00%> (+0.09%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 27ecdc5...21d1f19.

github-actions bot commented Jun 5, 2021

Package name | latest | stable
ADNLPModels.jl
AmplNLReader.jl
CUTEst.jl
CaNNOLeS.jl
DCI.jl
JSOSolvers.jl
LLSModels.jl
NLPModelsIpopt.jl
NLPModelsJuMP.jl
NLPModelsTest.jl
Percival.jl
QuadraticModels.jl
SolverBenchmark.jl
SolverTools.jl


@abelsiqueira (Member) left a comment

Thanks, that's a lot of work.
Most of my changes are stylistic and aim to minimize the number of changes.
I didn't review the last NLS parts because they repeat the same changes as the NLP parts. After that, I'll review again.

- x0 = [meta.x0; zeros(nequ)],
  lvar = [meta.lvar; fill(-Inf, nequ)],
  uvar = [meta.uvar; fill(Inf, nequ)],
+ x0 = [meta.x0; zeros(T, nequ)],
Member

I think this fails for GPU but I don't see a viable alternative, do you?

Member

Would this work?

x0 = zero(meta.x0, meta.nvar + nequ)
x0[1:meta.nvar] .= meta.x0

Member Author

I didn't test it, but this tutorial suggests that cat and fill are available methods for GPUArrays.
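An untested sketch of the GPU-friendlier construction being discussed. `pad_x0` is a hypothetical helper, not part of the package; the idea is that `similar`, `fill!`, and `vcat` avoid hard-coding `Vector`, so a GPU input would stay in its own array family (assuming those operations are supported for GPUArrays, as suggested above):

```julia
# Hypothetical helper (not in the package): build [x0; zeros] without a
# CPU-only zeros(T, n) call. similar(x0, nequ) allocates in the same
# array family as x0, fill! then zeroes it, and vcat concatenates.
pad_x0(x0::AbstractVector{T}, nequ::Integer) where {T} =
    vcat(x0, fill!(similar(x0, nequ), zero(T)))

pad_x0([1.0, 2.0], 3)  # 5-element Vector{Float64}: [1.0, 2.0, 0.0, 0.0, 0.0]
```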


@abelsiqueira (Member) left a comment

Thanks. I want to approve but I'm scared of the Probot automerging again.

@dpo (Member) left a comment

LGTM besides my two comments.


@tmigot (Member, Author) commented Jun 7, 2021

> LGTM besides my two comments.

@dpo I updated the constructors. What do you think?


probot-auto-merge bot merged commit fa420ee into JuliaSmoothOptimizers:master on Jun 11, 2021
@abelsiqueira (Member)

Thanks. And Probot auto-merged, but it was in accordance with what we discussed yesterday.

tmigot deleted the param-meta branch on June 11, 2021, 13:02