Expose presolve code #68
@mtanneau mentioned at the last JuMP developers call that he was considering making the presolve code here in Tulip available for use in other packages. I'm starting work on a prototype solver for which this code would be useful. I'll eventually want to build out some MIP presolve routines on top.
What is your preferred way to do this? Would you prefer we depend on Tulip? Or would you consider splitting the code off into a separate package?
cc @BochuanBob and @Anhtu07
Comments
I think it makes sense to eventually break things off into smaller packages, especially if several projects use it. At this point, Tulip's presolve is almost entirely self-contained. My current belief is that a "stand-alone" presolve package should be able to take a problem in, apply its reductions, and hand back the reduced problem together with the information needed for postsolve, i.e., to recover a solution of the original problem from a solution of the reduced one (a rough sketch of one possible shape is included below).
Pointer for some inspiration: COIN-OR's presolve. Some questions to fuel the discussion: which problem classes should be supported, and at what level (matrix data vs. MOI) should the interface sit?
cc @frapac
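To make the above concrete, here is a minimal sketch of how a stand-alone presolve API could operate. Every name in it (`LPData`, `presolve`, `postsolve`, the field names) is hypothetical and not taken from Tulip's actual code; it only illustrates the reduce-then-postsolve workflow described above.

```julia
using SparseArrays

# Hypothetical problem container in matrix form; all names are illustrative.
struct LPData
    c::Vector{Float64}               # objective coefficients
    A::SparseMatrixCSC{Float64,Int}  # constraint matrix
    lrow::Vector{Float64}            # constraint (row) lower bounds
    urow::Vector{Float64}            # constraint (row) upper bounds
    lcol::Vector{Float64}            # variable lower bounds
    ucol::Vector{Float64}            # variable upper bounds
end

# A stand-alone presolve would (1) reduce the problem and (2) record the
# applied reductions so a solution can be mapped back afterwards.
function presolve(lp::LPData)
    transformations = Any[]  # log of applied reductions (empty in this stub)
    reduced = lp             # actual reductions would rewrite the problem data
    return reduced, transformations
end

# Postsolve: undo the recorded reductions in reverse order to recover a
# solution of the original problem from a solution of the reduced one.
function postsolve(x_reduced::Vector{Float64}, transformations)
    x = copy(x_reduced)
    for t in reverse(transformations)
        # each transformation would reconstruct the entries it eliminated
    end
    return x
end
```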
We only care about MILP. For our purposes, we don't necessarily want to be tied to a particular solver, and we will eventually pipe the model through MOI anyway (to solve and/or bridge), so working at the MOI level is probably best for us. Is there anything about your current approach that would not map nicely to MOI? We will also potentially want to disable certain presolve routines but not others (e.g., in order to keep problem dimensions the same), so baking that into the API would be very useful.
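To illustrate what baking per-routine control into the API could look like, here is a hypothetical options struct in the spirit of the `LPData`/`presolve` sketch above; none of these field names come from Tulip and the specific reductions listed are just examples.

```julia
# Hypothetical per-routine toggles; field names are illustrative only.
Base.@kwdef struct PresolveOptions
    remove_empty_rows::Bool      = true   # drops constraints (changes dimensions)
    remove_fixed_variables::Bool = true   # drops variables (changes dimensions)
    tighten_bounds::Bool         = true   # keeps dimensions unchanged
    max_passes::Int              = 10     # passes over the reduction loop
end

# Keeping problem dimensions the same would then be a matter of switching off
# the reductions that delete rows or columns:
opts = PresolveOptions(remove_empty_rows = false, remove_fixed_variables = false)
```

The matrix-level `presolve` function from the previous sketch would then accept such an options object and skip any disabled reductions.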
Internally, the presolve code works with an LP representation in matrix form (objective vector, constraint matrix, and row/column bounds). To interface with MOI, you need to do the conversion MOI -> LP on the way in, and then LP -> MOI on the way out.
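For concreteness, a generic matrix-form LP of the kind described above can be written as follows; this is a sketch of such a representation, not necessarily the exact layout used inside Tulip:

$$
\begin{aligned}
\min_{x} \quad & c^\top x + c_0 \\
\text{s.t.} \quad & l_r \le A x \le u_r, \\
& l_c \le x \le u_c,
\end{aligned}
$$

where $c$ is the objective, $A$ the constraint matrix, and $(l_r, u_r)$, $(l_c, u_c)$ the row and variable bounds. Presolve reductions rewrite these data; the MOI <-> LP conversions translate between this form and an MOI model.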
I would also be interested in having a presolve package for https://github.com/exanauts/Simplex.jl
It seems reasonable to me to keep the internal representation the same, and then add an MOI and/or MatrixOptInterface interface on top as needed. Based on my brief look, I don't see any reason why you could not directly add the MIP information to your internal data structures. Generally, I'm not all that concerned about the indirection of going through MOI, so I'm fine with tying the API to MOI if others are as well. Ideally, there would also be a programmatic way to configure which presolve routines are applied.
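One possible shape for that layering, reusing the hypothetical `LPData`/`presolve`/`PresolveOptions` names from the sketches above; the `moi_to_matrix` / `matrix_to_moi` helpers are placeholders and not an existing API.

```julia
import MathOptInterface as MOI

# Placeholder conversion helpers; in practice these could be hand-written or
# built on top of a package such as MatrixOptInterface.
moi_to_matrix(model::MOI.ModelLike) = error("placeholder: extract LPData from an MOI model")
matrix_to_moi(lp) = error("placeholder: rebuild an MOI model from LPData")

# Thin MOI layer on top of the matrix-level core.
function presolve_moi(model::MOI.ModelLike; opts::PresolveOptions = PresolveOptions())
    lp = moi_to_matrix(model)                # MOI model -> matrix form
    reduced, transformations = presolve(lp)  # matrix-level presolve (opts would be forwarded here)
    return matrix_to_moi(reduced), transformations  # matrix form -> MOI model
end
```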
(By the way, we have some cycles to spend on making this happen, once we converge on a plan.)
I suggest we take the discussion over to... 🚧 MathOptPresolve.jl 🚧
I'm also very interested in building upon Tulip's presolve, but I don't use MOI. It would be great to simply pass an LP or QP in "matrix form" and receive a presolved problem in the same format. It seems there could be a low-level API with an MOI layer on top.
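For example, that matrix-form workflow could look like this with the hypothetical API sketched earlier in the thread (again, purely illustrative names):

```julia
using SparseArrays

# Small LP given directly in matrix form:
#   min -x1 - x2   s.t.   x1 + x2 <= 1,   0 <= x1, x2 <= 1
lp = LPData([-1.0, -1.0],                             # objective
            sparse([1, 1], [1, 2], [1.0, 1.0], 1, 2), # constraint matrix
            [-Inf], [1.0],                            # row bounds
            [0.0, 0.0], [1.0, 1.0])                   # variable bounds

reduced, transformations = presolve(lp)  # presolved problem, same matrix format
# ...solve `reduced` with any solver, then recover a solution of `lp`:
# x_original = postsolve(x_reduced, transformations)
```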
This issue has been stale for 2 years; closing. For reference, see MathOptPresolve.jl, mentioned above.