
Fix termination status when Highs_run warns #108

Merged: 1 commit into master on Apr 4, 2022

Conversation

@odow (Member) commented on Apr 3, 2022

Closes #105

@grahamgill do you have an example of a model that generates a warning in Highs_run so we can test this?
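
For context, a rough sketch of the behaviour this change is aiming for, assuming the kHighsStatus* constants exposed by the C API wrapper (illustrative only, not the actual diff; the helper name is made up):

using HiGHS

# Illustrative sketch only: a kHighsStatusWarning return from Highs_run
# should not be surfaced as MOI.OTHER_ERROR; only kHighsStatusError should.
# On a warning, the termination status is instead derived from the model
# status that HiGHS reports.
function treat_run_status_as_error(run_status)
    return run_status == HiGHS.kHighsStatusError   # warnings fall through
end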

codecov bot commented on Apr 3, 2022

Codecov Report

Merging #108 (6e6c967) into master (8e39503) will not change coverage.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##           master     #108   +/-   ##
=======================================
  Coverage   82.96%   82.96%           
=======================================
  Files           3        3           
  Lines        1303     1303           
=======================================
  Hits         1081     1081           
  Misses        222      222           
Impacted Files       Coverage           Δ
src/MOI_wrapper.jl   93.05% <100.00%>   (ø)


Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data

odow merged commit 2a6af65 into master on Apr 4, 2022
odow deleted the od/optimize-error branch on April 4, 2022 at 20:56
@odow (Member, Author) commented on Apr 4, 2022

Merging since it seems like a strict improvement for now. @grahamgill, if you have an example that warns, please let me know.

@grahamgill commented
@odow here's an MPS file that gives a Highs_run warning return value (kHighsStatusWarning) and produces a kHighsModelStatusUnknown model status. (The .TXT extension was added to keep GitHub's file-attachment rules happy.)
LPmodel.mps.TXT

Question: To produce this example, I tried setting these attributes on HiGHS.Optimizer:

MOI.set(solver, MOI.RawOptimizerAttribute("write_model_file"), "/home/graham/LPmodel.mps")
MOI.set(solver, MOI.RawOptimizerAttribute("write_model_to_file"), true)

but they seem not to do anything when I run the HiGHS solver using either Convex or JuMP. However, when I use JuMP's

write_to_file(model, "/home/graham/LPmodel.mps")

the MPS file is written out. Is there something else I need to do to write the MPS file using MOI.set(solver, MOI.RawOptimizerAttribute(... as above? (E.g. if I don't want to recreate my problem in JuMP just so that I can write out the MPS file.)
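
(For reference, a hedged aside rather than part of the original report: the same two HiGHS options can also be set through JuMP's set_optimizer_attribute; whether HiGHS actually writes the file when they are set this way is exactly the open question here.)

using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
# Equivalent to the MOI.set calls above, just routed through JuMP.
set_optimizer_attribute(model, "write_model_file", "/home/graham/LPmodel.mps")
set_optimizer_attribute(model, "write_model_to_file", true)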

@odow (Member, Author) commented on Apr 5, 2022

This is an interesting example; can I share it with the HiGHS team?

julia> using JuMP, HiGHS

julia> model = read_from_file("/Users/Oscar/Downloads/LPModel.mps.TXT")
A JuMP Model
Maximization problem with:
Variables: 64
Objective function type: AffExpr
`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 66 constraints
`AffExpr`-in-`MathOptInterface.GreaterThan{Float64}`: 48 constraints
`AffExpr`-in-`MathOptInterface.LessThan{Float64}`: 14 constraints
`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 64 constraints
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> set_optimizer(model, HiGHS.Optimizer)

julia> optimize!(model)
Presolving model
2 rows, 28 cols, 56 nonzeros
1 rows, 22 cols, 22 nonzeros
1 rows, 22 cols, 22 nonzeros
Presolve : Reductions: rows 1(-127); columns 22(-42); elements 22(-232)
Solving the presolved LP
Using EKK dual simplex solver - serial
  Iteration        Objective     Infeasibilities num(sum)
          0    -2.4703596357e+00 Pr: 1(12577.5) 0s
         16    -3.7668335258e+00 Pr: 0(0); Du: 1(1.43347e-07) 0s
         16    -3.7668335258e+00 Pr: 0(0); Du: 1(1.43347e-07) 0s
Model   status      : Unknown
Simplex   iterations: 16
Objective value     : -3.7668335258e+00
HiGHS run time      :          0.00

julia> solution_summary(model)
* Solver : HiGHS

* Status
  Termination status : OTHER_ERROR
  Primal status      : FEASIBLE_POINT
  Dual status        : INFEASIBLE_POINT
  Message from the solver:
  "kHighsModelStatusUnknown"

* Candidate solution
  Objective value      : -3.76683e+00
  Objective bound      : -0.00000e+00
  Dual objective value : 3.10950e-03

* Work counters
  Solve time (sec)   : 3.20404e-03
  Simplex iterations : 16
  Barrier iterations : 0

> if I don't want to recreate my problem in JuMP just so that I can write out the MPS file

Are you building it in HiGHS directly? If so:

using HiGHS
solver = HiGHS.Optimizer()
# ... build model
Highs_writeModel(solver, "/home/graham/LPmodel.mps")

Or, if you have an MOI object:

MOI.write_to_file(solver, "LPmodel.mps")
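
A slightly more explicit variant of the same idea, as a hedged sketch (the FileFormats names are standard MOI, but `solver` is assumed to be the HiGHS.Optimizer instance that already holds the problem):

using MathOptInterface
const MOI = MathOptInterface

# Build an MPS-format destination model, copy the solver's contents into it,
# and write it out. MOI.write_to_file(solver, "LPmodel.mps") above does the
# same thing in one call, inferring the format from the file extension.
dest = MOI.FileFormats.Model(format = MOI.FileFormats.FORMAT_MPS)
MOI.copy_to(dest, solver)
MOI.write_to_file(dest, "/home/graham/LPmodel.mps")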

@grahamgill commented
@odow sure, please share the example. I have a few others that produce the same solver status results.

In my process, I solve an LP first in order to find a scaling factor that will be used for one term in the objective function of a MILP. Because I'm doing this for hundreds of different problems, it has to be automated, but the scaling factor produced by solving the LP doesn't need to be optimal, just "good enough". Hence my interest in distinguishing a HiGHS warning status from an error status: with a warning, the scaling factor may still be good enough, and I can quickly review the cases that produced warnings, check their model statuses, and limit the number of cases I look at in more detail.
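
A hedged sketch of the kind of check this enables (the acceptance logic and names are illustrative, not taken from the workflow described above): once a Highs_run warning no longer collapses into OTHER_ERROR, the termination and primal statuses can be used to decide whether a scaling factor is "good enough" or needs manual review.

using JuMP, HiGHS

# `model` is assumed to be one of the scaling LPs, already built and
# attached to HiGHS.Optimizer.
function scaling_factor_with_flag(model)
    optimize!(model)
    if termination_status(model) == MOI.OPTIMAL
        return objective_value(model), :ok
    elseif primal_status(model) == MOI.FEASIBLE_POINT
        # HiGHS warned (e.g. model status Unknown), but a feasible point
        # exists, so the value may still be good enough; flag it for review.
        return objective_value(model), :review
    else
        return nothing, :error
    end
end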

I was building the model in Convex originally. I assumed that setting

MOI.set(solver, MOI.RawOptimizerAttribute("write_model_file"), "/home/graham/LPmodel.mps")
MOI.set(solver, MOI.RawOptimizerAttribute("write_model_to_file"), true)

would cause the MPS file to be written when I ran Convex.solve! (or, in JuMP, when I ran JuMP.optimize!), but that doesn't seem to be the case. Thanks for pointing out MOI.write_to_file.

Successfully merging this pull request may close these issues.

_OPTIMIZE_WARNED status?