
Access curr_grad_f, curr_grad_lag_x, curr_grad_lag_s, curr_exact_hessian in Ipopt::TNLP::intermediate_callback #451

Closed
g-braeunlich opened this issue Feb 25, 2021 · 2 comments

g-braeunlich commented Feb 25, 2021

Hello,
From the docs I understand that I cannot directly use the return values of the IpoptCalculatedQuantities::curr_* functions; I first have to pass them through the Resort* functions of an adapter.
What would be the Resort* functions corresponding to curr_grad_* and curr_exact_hessian?

@g-braeunlich g-braeunlich changed the title Access curr_graf_f, curr_grad_lag_x, curr_grad_lag_s, curr_exact_hessian in Ipopt::TNLP::intermediate_callback Access curr_grad_f, curr_grad_lag_x, curr_grad_lag_s, curr_exact_hessian in Ipopt::TNLP::intermediate_callback Feb 25, 2021
@svigerske
Member

For gradients, ResortX should apply, but for entries that correspond to fixed variables, you will get the value of the variable instead of 0.
For the Hessian, I don't think that there is any function.

But what happens in ResortX isn't that complicated: P_x_full_x_ is a permutation matrix that stores the mapping between the variable indices in your TNLP and those in the NLP that Ipopt works with internally. Unfortunately, it is private at the moment, but you can just change the code. There also seems to be an h_idx_map_ that could be useful for the Hessian.
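For reference, a minimal sketch of what this could look like inside intermediate_callback, following the TNLPAdapter retrieval pattern from the Ipopt documentation; MyNLP and the member n_ (number of TNLP variables) are placeholders, and the cast chain assumes no further NLP transformation sits between your TNLP and Ipopt:

```cpp
#include <vector>
#include "IpIpoptCalculatedQuantities.hpp"
#include "IpIpoptData.hpp"
#include "IpOrigIpoptNLP.hpp"
#include "IpTNLPAdapter.hpp"

bool MyNLP::intermediate_callback(
   Ipopt::AlgorithmMode mode, Ipopt::Index iter, Ipopt::Number obj_value,
   Ipopt::Number inf_pr, Ipopt::Number inf_du, Ipopt::Number mu,
   Ipopt::Number d_norm, Ipopt::Number regularization_size,
   Ipopt::Number alpha_du, Ipopt::Number alpha_pr, Ipopt::Index ls_trials,
   const Ipopt::IpoptData* ip_data, Ipopt::IpoptCalculatedQuantities* ip_cq)
{
   using namespace Ipopt;

   if( ip_cq == NULL )
      return true;

   // Dig out the TNLPAdapter that maps between the TNLP and Ipopt's internal NLP.
   OrigIpoptNLP* orignlp = dynamic_cast<OrigIpoptNLP*>(GetRawPtr(ip_cq->GetIpoptNLP()));
   if( orignlp == NULL )
      return true;
   TNLPAdapter* tnlp_adapter = dynamic_cast<TNLPAdapter*>(GetRawPtr(orignlp->nlp()));
   if( tnlp_adapter == NULL )
      return true;

   // curr_grad_f() is in Ipopt's internal ordering; ResortX rewrites it into
   // the TNLP ordering. Entries for fixed variables hold the variable value, not 0.
   std::vector<Number> grad_f_orig(n_);
   tnlp_adapter->ResortX(*ip_cq->curr_grad_f(), grad_f_orig.data());

   return true; // keep iterating
}
```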

svigerske added a commit that referenced this issue Apr 14, 2021
@svigerske
Member

Ipopt 3.14 adds some functions to TNLP that allow requesting the current iterate and various violations during intermediate_callback (or finalize_solution).
This gives access to the absolute values of the entries of curr_grad_lag_x.
curr_grad_lag_s should correspond to the dual values of the constraints, so those should also be available.

I don't think I want to add anything to return the objective gradient or Hessian to the user.
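For reference, a rough sketch against the 3.14 interface (get_curr_iterate / get_curr_violations); MyNLP and the members n_ and m_ (numbers of variables and constraints) are placeholders, and the exact parameter lists should be checked against IpTNLP.hpp:

```cpp
#include <vector>
#include "IpTNLP.hpp"

bool MyNLP::intermediate_callback(
   Ipopt::AlgorithmMode mode, Ipopt::Index iter, Ipopt::Number obj_value,
   Ipopt::Number inf_pr, Ipopt::Number inf_du, Ipopt::Number mu,
   Ipopt::Number d_norm, Ipopt::Number regularization_size,
   Ipopt::Number alpha_du, Ipopt::Number alpha_pr, Ipopt::Index ls_trials,
   const Ipopt::IpoptData* ip_data, Ipopt::IpoptCalculatedQuantities* ip_cq)
{
   using Ipopt::Number;

   // Unscaled primal/dual values of the current iterate in TNLP ordering.
   std::vector<Number> x(n_), z_L(n_), z_U(n_), lambda(m_);
   get_curr_iterate(ip_data, ip_cq, /*scaled=*/false, n_,
                    x.data(), z_L.data(), z_U.data(), m_, lambda.data());

   // grad_lag_x receives the (violation of the) gradient of the Lagrangian w.r.t. x.
   std::vector<Number> x_L_viol(n_), x_U_viol(n_), compl_x_L(n_), compl_x_U(n_), grad_lag_x(n_);
   std::vector<Number> constr_viol(m_), compl_g(m_);
   get_curr_violations(ip_data, ip_cq, /*scaled=*/false, n_,
                       x_L_viol.data(), x_U_viol.data(), compl_x_L.data(), compl_x_U.data(),
                       grad_lag_x.data(), m_, constr_viol.data(), compl_g.data());

   return true; // keep iterating
}
```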
