Conversation

@tmigot
Member

@tmigot tmigot commented Jul 7, 2023

only for objective function hprod

@codecov

codecov bot commented Jul 7, 2023

Codecov Report

Patch coverage: 97.11% and project coverage change: +0.14% 🎉

Comparison is base (3d8e3ea) 95.32% compared to head (a5b2d9d) 95.47%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #171      +/-   ##
==========================================
+ Coverage   95.32%   95.47%   +0.14%     
==========================================
  Files          14       14              
  Lines        1647     1744      +97     
==========================================
+ Hits         1570     1665      +95     
- Misses         77       79       +2     
Impacted Files Coverage Δ
src/ADNLPModels.jl 100.00% <ø> (ø)
src/predefined_backend.jl 100.00% <ø> (ø)
src/reverse.jl 65.45% <93.61%> (+19.18%) ⬆️
src/sparse_hessian.jl 100.00% <100.00%> (ø)

... and 2 files with indirect coverage changes


@tmigot
Member Author

tmigot commented Jul 7, 2023

Connected to #165

When it works, the results are promising: https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl/blob/benchmark/benchmark/2023-07-07_adnlpmodels_benchmark_hprod_optimized_nscal_1000_mono.png

However, it does not always work. For instance, the following fails:

```julia
using ADNLPModels, NLPModels, OptimizationProblems

nlp = OptimizationProblems.ADNLPProblems.clnlbeam(n = 1000, hprod_backend = ADNLPModels.ReverseDiffADHvprod);
n = nlp.meta.nvar
v = [sin(Float64(i) / 10) for i = 1:n]
hprod(nlp, get_x0(nlp), v)
```

This can straightforwardly be adapted to compute the sparse objective Hessian.
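The idea behind such Hv-product backends can be illustrated with a small self-contained sketch (hypothetical toy code, not the ADNLPModels implementation): a Hessian-vector product is the directional derivative of the gradient, Hv = d/dt ∇f(x + t·v) at t = 0. `ReverseDiffADHvprod` differentiates a reverse-mode gradient in a forward direction; the sketch below approximates the same quantity with central finite differences of an analytic gradient.

```julia
# Toy objective and its analytic gradient (illustration only)
f(x) = sum(xi^4 for xi in x)
grad(x) = 4 .* x .^ 3

# Hv ≈ (∇f(x + h v) - ∇f(x - h v)) / 2h: the directional derivative of the gradient
function hvp(grad, x, v; h = 1e-5)
    return (grad(x .+ h .* v) .- grad(x .- h .* v)) ./ (2h)
end

x = collect(1.0:4.0)
v = ones(4)
Hv = hvp(grad, x, v)
```

For f(x) = Σ xᵢ⁴ the Hessian is diag(12 xᵢ²), so `Hv` should be close to `12 .* x .^ 2 .* v`.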

@github-actions
Contributor

github-actions bot commented Jul 7, 2023

Package name latest stable
CaNNOLeS.jl
DCISolver.jl
DerivativeFreeSolvers.jl
JSOSolvers.jl
NLPModelsIpopt.jl
OptimizationProblems.jl
Percival.jl
QuadraticModels.jl
SolverBenchmark.jl
SolverTools.jl

@tmigot tmigot added the enhancement New feature or request label Jul 12, 2023
@tmigot tmigot mentioned this pull request Jul 14, 2023
@tmigot tmigot force-pushed the reverseforwardhpro branch from cce7f22 to 2afe018 Compare July 17, 2023 16:28
@tmigot
Member Author

tmigot commented Jul 17, 2023

@jbcaillau @ocots @amontoison @BaptisteCbl

I added `hprod` for the Lagrangian Hessian and optimized the sparse Hessian computation.
This is a clear improvement, see the attached benchmarks.
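For readers who want to try this, here is a hypothetical usage sketch (the keyword matches the one used earlier in this thread, but the example itself is not from the PR) showing how to select the Hv-product backend when constructing a model directly:

```julia
using ADNLPModels, NLPModels

# Toy chained-Rosenbrock objective (illustration only)
f(x) = sum(100 * (x[i + 1] - x[i]^2)^2 + (x[i] - 1)^2 for i = 1:length(x) - 1)
x0 = fill(0.5, 10)

# Select the ReverseDiffADHvprod backend via keyword argument
nlp = ADNLPModel(f, x0, hprod_backend = ADNLPModels.ReverseDiffADHvprod)

v = ones(10)
Hv = hprod(nlp, x0, v)  # for an unconstrained model, this is the objective Hv-product
```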

@tmigot tmigot requested a review from amontoison July 17, 2023 19:04
Member

@amontoison amontoison left a comment


Fantastic Tangi! 🎉

@ocots

ocots commented Jul 18, 2023

Test with this file: https://github.com/ocots/JumpVSADNLP/blob/master/bench.jl

Hessian of Lagrangian
ADNLP
  1.191 ms (157 allocations: 141.55 KiB)
JuMP
  60.402 μs (137 allocations: 24.08 KiB)
Jacobian of constraints
ADNLP
  10.509 μs (22 allocations: 20.70 KiB)
JuMP
  13.329 μs (15 allocations: 19.80 KiB)

Seems to rock!

@ocots

ocots commented Jul 18, 2023

Great job!

New release please 🙏🙏🙏 (JuliaCon next Tuesday)

@tmigot
Member Author

tmigot commented Jul 18, 2023

Let me add some documentation on the new features and improvements, and then I'll tag the new release (likely tomorrow).

@tmigot tmigot merged commit 470cec6 into main Jul 18, 2023
@tmigot tmigot deleted the reverseforwardhpro branch July 18, 2023 17:18
@jbcaillau

Many thanks @tmigot for the timely update. Doing some more tests with @PierreMartinon (Goddard test case).

@dpo
Member

dpo commented Jul 18, 2023

Amazing results @tmigot! Congratulations!

@jbcaillau

@tmigot @amontoison check this

@amontoison
Member

Thanks @jbcaillau! Our computation of the sparse Hessian is 20x / 30x slower than JuMP.
We should be able to improve it by taking into account the symmetry of the Hessian to limit the number of directional derivatives of the gradient.
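The coloring idea behind this can be sketched in a few lines (a minimal stdlib-only illustration, not the ADNLPModels code): for a tridiagonal Hessian, a distance-2 coloring of the columns (color i mod 3) lets three Hessian-vector products recover every nonzero entry, instead of n products with unit vectors; exploiting symmetry (star coloring) can reduce the count further.

```julia
using LinearAlgebra

# Tridiagonal stand-in for a sparse Hessian (illustration only)
n = 7
H = SymTridiagonal(fill(2.0, n), fill(-1.0, n - 1))

# Distance-2 coloring: in any row, columns j-1, j, j+1 get distinct colors
colors = [mod(i - 1, 3) + 1 for i in 1:n]
seeds = [Float64.(colors .== c) for c in 1:3]
products = [H * s for s in seeds]  # only 3 "directional derivatives of the gradient"

# Each nonzero H[j, i] is read off the product associated with color(i),
# since no other column of that color is nonzero in row j
Hrec = zeros(n, n)
for j in 1:n, i in max(1, j - 1):min(n, j + 1)
    Hrec[j, i] = products[colors[i]][j]
end
```

Here `Hrec` matches `H` exactly, with 3 products instead of 7.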

@jbcaillau

jbcaillau commented Jul 27, 2023

@amontoison thanks for the feedback. Probably some profiling to do on our side, as our objective and constraint evaluations might be far less efficient than the pure Julia / JuMP code written for the comparison.

ps. greetings from JuliaCon 2023 with @ocots @gergaud ... and @tmigot !
