Hello

From the docs I understand that I cannot directly use the return values given by the IpoptCalculatedQuantities::curr_* functions; I first have to use some Resort* functions of an adapter.
What would be the Resort* functions corresponding to curr_grad_* and curr_exact_hessian?
g-braeunlich changed the title from "Access curr_graf_f, curr_grad_lag_x, curr_grad_lag_s, curr_exact_hessian in Ipopt::TNLP::intermediate_callback" to "Access curr_grad_f, curr_grad_lag_x, curr_grad_lag_s, curr_exact_hessian in Ipopt::TNLP::intermediate_callback" on Feb 25, 2021.
For gradients, ResortX should apply, but for entries that correspond to fixed variables, you will get the value of the variable instead of 0.
For the Hessian, I don't think that there is any function.
But what happens in ResortX isn't that complicated: P_x_full_x_ is a permutation matrix that stores the mapping between indices in your TNLP and the NLP that Ipopt works with internally. Unfortunately, it is private at the moment; fortunately, you can just change the code. There also seems to be an h_idx_map_ that could be useful.
Ipopt 3.14 added functions to TNLP (get_curr_iterate() and get_curr_violations()) that allow requesting the current iterate and various violations during intermediate_callback (or finalize_solution).
This gives access to the absolute values in curr_grad_lag_x. curr_grad_lag_s should correspond to dual values of the constraints, so they should also be available.
I don't think I want to do anything to return the objective gradient or Hessian back to the user.