
LBFGS approximation of the inverse Hessian #39

Closed
mohamed82008 opened this issue Jun 16, 2021 · 9 comments
Labels: enhancement (New feature or request)

Comments

mohamed82008 commented Jun 16, 2021
Hi, thanks for this fantastic package. I have a question: I couldn't see any way to avoid passing the Hessian in and instead rely on an LBFGS approximation of the (inverse) Hessian. Is there a way to do this in MadNLP now? If not, would it be too much work to add?

frapac (Collaborator) commented Jun 16, 2021

I am running into a similar issue. I think we could build something directly into the Hessian callback by wrapping the LBFGS operator implemented in JuliaSmoothOptimizers:
https://github.com/JuliaSmoothOptimizers/LinearOperators.jl/blob/master/src/lbfgs.jl
It is not straightforward to implement, but I think it is doable. In my opinion, the main difficulty is passing the LBFGS matrix into the KKT matrix before solving the linear system.
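
For illustration, here is a minimal sketch of that idea, assuming the `LBFGSOperator`/`InverseLBFGSOperator` constructors and `push!` update from LinearOperators.jl; this is only the operator side, not MadNLP's actual callback interface:

```julia
# Sketch: maintain limited-memory quasi-Newton operators across
# iterations and apply them matrix-free, instead of forming a Hessian.
using LinearOperators

n = 10
B = LBFGSOperator(n; mem = 5)          # approximates the Hessian
H = InverseLBFGSOperator(n; mem = 5)   # approximates its inverse

# After each iterate, push the step s = x⁺ - x and the gradient
# difference y = ∇f(x⁺) - ∇f(x) into the operators:
s, y = rand(n), rand(n)                # stand-ins for real (s, y) pairs
push!(B, s, y)
push!(H, s, y)

v = rand(n)
Bv = B * v   # Hessian-vector product; no matrix is ever formed
Hv = H * v   # inverse-Hessian-vector product
```

Feeding such a matrix-free operator into MadNLP's assembled KKT matrix is exactly the difficulty mentioned above.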

mohamed82008 (Author) commented

I think we should use the Schur complement to solve the system. Assume D is the Hessian below.

[image: 2×2 block matrix M = [A B; C D], with D the Hessian block]

The linear system M \ b can be solved easily if we have an operator x -> D^-1 * x.
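
To make the elimination concrete, here is a minimal sketch, assuming M = [A B; C D] with D the Hessian block and a function `Dinv` implementing x -> D^-1 * x (all names are illustrative):

```julia
# Sketch of the Schur-complement solve for M [x; y] = [b1; b2]:
# the second block row gives y = D⁻¹(b2 - C x); substituting into the
# first row yields (A - B D⁻¹ C) x = b1 - B D⁻¹ b2.
using LinearAlgebra

function schur_solve(A, B, C, Dinv, b1, b2)
    S = A - B * Dinv(C)              # Schur complement of D in M
    x = S \ (b1 - B * Dinv(b2))
    y = Dinv(b2 - C * x)
    return x, y
end

# Toy usage with a dense SPD stand-in for D, so Dinv is a backslash:
n, m = 4, 2
A, B, C = rand(m, m), rand(m, n), rand(n, m)
D = Symmetric(rand(n, n)) + n * I    # diagonally dominant, hence SPD
x, y = schur_solve(A, B, C, v -> D \ v, rand(m), rand(n))
```

Only the operator x -> D^-1 * x is needed for the Hessian block; the Schur complement itself has the (small) dimension of the A block.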

sshin23 (Member) commented Jun 18, 2021

Thanks for bringing this up, @mohamed82008 @frapac. I agree that MadNLP should have a quasi-Newton option. The linear operator implementation for LBFGS in JSO looks great; maybe we can use it.

It seems to me that this may take a good amount of time, though. I'll try to implement it over the summer, but I can't promise anything yet. I'll keep this issue open and post progress updates here.

frapac (Collaborator) commented Oct 5, 2022

LBFGS has been implemented in this PR: #221
In the end, we do not depend on JSO for the LBFGS implementation; instead, we use the compact representation introduced in:

Byrd, Richard H., Jorge Nocedal, and Robert B. Schnabel. "Representations of quasi-Newton matrices and their use in limited memory methods." Mathematical Programming 63, no. 1 (1994): 129-156.

This is similar (with slight differences) to what is currently implemented in Ipopt.
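
For reference, the compact representation from that paper writes the LBFGS matrix directly in terms of the stored pairs. With memory size m, S_k = [s_{k-m} … s_{k-1}], Y_k = [y_{k-m} … y_{k-1}], initial matrix B_0 = σ_k I, L_k the strictly lower triangular part of S_kᵀ Y_k, and D_k = diag(s_iᵀ y_i), the formula reads (a sketch of the paper's result, not MadNLP's exact code):

```math
B_k = B_0 - \begin{bmatrix} B_0 S_k & Y_k \end{bmatrix}
\begin{bmatrix} S_k^\top B_0 S_k & L_k \\ L_k^\top & -D_k \end{bmatrix}^{-1}
\begin{bmatrix} S_k^\top B_0 \\ Y_k^\top \end{bmatrix}
```

The middle matrix is only 2m × 2m, so products with B_k cost O(mn) without ever forming an n × n matrix.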

frapac (Collaborator) commented Mar 27, 2023

Solved by #221

frapac closed this as completed Mar 27, 2023
mohamed82008 (Author) commented

Been admiring this work from a distance. Glad it finally made it in 🎉

frapac (Collaborator) commented Mar 28, 2023

Thank you for your support! We would love to see MadNLP integrated into NonConvex.jl in the medium term :)

francis-gagnon commented Jun 13, 2023

Maybe I'm missing something, but does this mean that user-defined functions with JuMP + MadNLP should work now?

I'm still getting the `Hessian information is needed.` error using JuMP 1.11.1 and MadNLP 0.7.0.
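
For context, a minimal reproducer presumably looks like the sketch below (hypothetical; `my_f` is illustrative). With JuMP's legacy `register` and `autodiff = true`, only first derivatives are supplied for multivariate functions, which is what used to trigger this error:

```julia
# Sketch: a user-defined function with JuMP + MadNLP. Since autodiff
# registration provides gradients but no Hessian, MadNLP's default
# exact-Hessian mode raised "Hessian information is needed."
using JuMP, MadNLP

my_f(x, y) = (x - 1)^2 + (y - 2)^2

model = Model(MadNLP.Optimizer)
@variable(model, x)
@variable(model, y)
register(model, :my_f, 2, my_f; autodiff = true)
@NLobjective(model, Min, my_f(x, y))
optimize!(model)
```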

Related to: #115

Edit: I'm not sure you still receive notifications after the issue is closed, so pinging @frapac @sshin23

Thanks

frapac (Collaborator) commented Apr 10, 2024

@francis-gagnon indeed, I did not receive a notification for your comment earlier. MadNLP currently does not support user-defined operators inside MOI. This is resolved by this new PR: #322
