
Sensitivity analysis #468

Closed
mckib2 opened this issue Feb 18, 2021 · 11 comments
mckib2 (Contributor) commented Feb 18, 2021

@jajhall

We are currently working on including sensitivity analysis for HiGHS' results in SciPy (see mckib2/scipy#10 for the working PR).

The plan right now is to mimic the MATLAB-style "lambda" result object that holds the Lagrange multipliers for the inequality, equality, and bound constraints.  Here's a quick sketch of what I'm doing to get this information (not intended to be running code):

Highs highs;

// load and solve problem...

HighsBasis basis = highs.getBasis();
HighsSolution solution = highs.getSolution();

std::vector<double> lambda, lower, upper;

// slacks in HiGHS appear as Ax - s, not Ax + s, so lambda is negated;
// lambda are the Lagrange multipliers associated with Ax = b;
// we can extract multipliers corresponding to inequalities and equalities from here trivially
for (const double el : solution.row_dual) {
    lambda.push_back(-el);
}

// col_dual are the Lagrange multipliers associated with the bound constraints;
// we want to separate them into upper and lower bounds
for (std::size_t ii = 0; ii < solution.col_dual.size(); ++ii) {
    if (basis.col_status[ii] == HighsBasisStatus::LOWER) {
        lower.push_back(solution.col_dual[ii]);
        upper.push_back(0);
    }
    else if (basis.col_status[ii] == HighsBasisStatus::UPPER) {
        lower.push_back(0);
        upper.push_back(solution.col_dual[ii]);
    }
    else {
        lower.push_back(0);
        upper.push_back(0);
        // should we be doing anything for other HighsBasisStatuses?
    }
}

The question is whether this correctly extracts the Lagrange multipliers (in the variables lambda, lower, and upper), and whether there is other interesting information that the user might want to have. The thing that makes me nervous is that only the UPPER and LOWER HighsBasisStatus values are considered; all the others are essentially thrown away.

Additionally, we'd be curious to know if you have any opinions about what to call this result. MATLAB names the results struct "lambda", but we can't do that in Python because lambda is a reserved keyword. Instead we are calling it "sensitivity". Other places refer to these results as "duals", "shadow prices", or "Lagrange multipliers", but "sensitivity" seemed like the best choice to us.

mckib2 (Contributor, Author) commented Feb 18, 2021

Pinging @mdhaber so he can follow this query if desired

odow (Collaborator) commented Feb 18, 2021

I would call these "dual variables." Another common term is "shadow price."

JuMP does this by looking at the sign of the dual. If minimizing, the dual applies to the lower bound if it is positive, and the upper bound if it is negative. When maximizing, the dual applies to the lower bound if it is negative, and the upper bound if it is positive.

Here's the code, but there is a sense_corrector term because we have a different definition of duality:
https://github.com/jump-dev/HiGHS.jl/blob/cf5ba9e94c656514f85092f815ddef1c896cda07/src/MOI_wrapper.jl#L1703-L1726

p.s. Off-topic: I was just poking around in the linked scipy PRs. Does scipy have a python implementation of simplex?!?! What was the motivation for that? Why not wrap Clp/GLPK? I guess HiGHS is the replacement?

mdhaber commented Feb 18, 2021

> I would call these "dual variables." Another common term is "shadow price."

In SciPy we're planning on calling the attribute of the result object sensitivity, but in the documentation we explain that it contains the partial derivatives of the objective function with respect to the right-hand side of the constraints, and that other names for this information include "shadow prices", "dual values", and "Lagrange multipliers". Out of the many mathematical terms for this information, I suggested sensitivity because I think the colloquial meaning of the word is more suggestive of its purpose than the other terms.

mckib2 (Contributor, Author) commented Feb 18, 2021

> Why not wrap Clp/GLPK?

I don't want to derail the conversation here, but a few thoughts:
A lot of great solvers are not license compliant - in fact most of them aren't! And to a lesser degree, Python has a different distribution model than Julia. Historically it has heavily favored source distributions meaning complied code needed to be highly portable (which is not always the case). The Python implementation was a way to give a license compliant, simple option compared to the existing Python wrapper libraries which largely use a "bring your own solver" approach, offloading the hard work of building binaries to the user. The Python implementation actually works very well for what it is, but of course HiGHS is a very good and welcome upgrade for us

odow (Collaborator) commented Feb 18, 2021

The only argument against "sensitivity" is that it is a term of art for the question "how far can the RHS change while the solution remains optimal?"

See https://jump.dev/JuMP.jl/stable/solutions/#Sensitivity-analysis-for-LP.

Gurobi uses the same term: https://www.gurobi.com/documentation/9.0/refman/attributes.html
[screenshot: Gurobi's sensitivity-information attributes]

It probably makes the most sense for more general users, however.

Thanks for the background. The binary dependency story of Julia used to be a problem for us, but it is now excellent thanks to https://github.com/JuliaPackaging/Yggdrasil. HiGHS has automated builds for every platform: https://github.com/JuliaBinaryWrappers/HiGHS_jll.jl (binaries https://github.com/JuliaBinaryWrappers/HiGHS_jll.jl/releases/tag/HiGHS-v0.2.1%2B0)

jajhall (Member) commented Feb 18, 2021

"Sensitivity" is a reasonable term for what you want, assuming that you only want the "rate of change" data provided by what are variously termed "shadow prices", "dual values", and "Lagrange multipliers". However, as @odow says, there is a bigger picture, since some users want full sensitivity analysis that includes values for "how far can this cost/bound value change and the basis remain optimal". These are provided by what Gurobi calls "sensitivity information". I don't like this term, preferring to call them "range values", as Xpress does:
https://www.fico.com/fico-xpress-optimization/docs/latest/mosel/mosel_lang/dhtml/getrange.html
In HiGHS, "ranging information" goes beyond what Gurobi and Xpress offer, with the HighsRangingRecord in src/lp_data/HighsRanging.h providing the index of entering/leaving variables when the cost/bound reaches its limiting value. Providing all this was necessary for a client!

mckib2 (Contributor, Author) commented Feb 18, 2021

@jajhall Excellent - I was vaguely aware of that ranging information and I'll look into including this in our output

As for getting the rate-of-change information for the bounds, is @odow's method of looking at the sign preferred, or is HighsBasisStatus always indicative of the right answer?

jajhall (Member) commented Feb 18, 2021

> @jajhall Excellent - I was vaguely aware of that ranging information and I'll look into including this in our output
>
> As for getting the bounds rate of change information, is @odow 's method of looking at the sign preferred, or is HighsBasisStatus always indicative of the right answer?

Indeed, I meant to comment. @odow's approach of using the sign of the dual to identify whether a (boxed) variable is at its lower or upper bound is unreliable. At a (near) dual-degenerate solution, duals will have (near-)zero values which, if nonzero and smaller in magnitude than the dual feasibility tolerance, may be positive or negative for a variable at either bound.

mdhaber commented Feb 18, 2021

Re: ranging information - great! We'll want to include that in sensitivity, too.

mckib2 (Contributor, Author) commented Feb 19, 2021

Okay, since all the other statuses deal with basic columns, I think this issue can be closed. Thanks all!

mckib2 closed this as completed Feb 19, 2021
jajhall (Member) commented Feb 21, 2021

> Okay, I think since all the other statuses deal with basic columns, I think this issue can be closed. Thanks all!

No, HighsBasisStatus::ZERO relates to free variables that are nonbasic. They have a primal value of zero and should have a zero dual.

Note that I have decided to negate the row duals so that they are mathematically correct and consistent with Gurobi. They were negated before for historical compatibility. This won't happen until the first formal release. Naturally I will let you know when this happens.
