
Add VectorNonlinearFunction #2201

Merged
merged 29 commits into from Aug 14, 2023

Conversation

odow
Member

@odow odow commented Jun 9, 2023

Resurrecting VectorNonlinearFunction from the depths of #2059.

There are two main motivating problems for this:

  1. VectorNonlinearFunction-in-Complements constraints for nonlinear complementarity problems
  2. ObjectiveFunction{VectorNonlinearFunction} for nonlinear multi-objective problems

Documentation

https://jump.dev/MathOptInterface.jl/previews/PR2201/reference/standard_form/#MathOptInterface.VectorNonlinearFunction

I also removed this from #2059 because there were a lot of issues with the bridges, but I have a better handle on what to do now, and we can easily run solver-tests to catch the culprits.

https://github.com/jump-dev/MathOptInterface.jl/actions/runs/5218217869

@odow
Member Author

odow commented Jun 12, 2023

Marking this as blocked by #2203

@blegat
Member

blegat commented Jun 13, 2023

One thing to consider:

struct VectorFunction{F} <: MOI.AbstractVectorFunction
    elements::Vector{F}
end
const VectorOfVariables = VectorFunction{VariableIndex}
const VectorNonlinearFunction = VectorFunction{Any}

It could simplify things, because we then don't have to implement operate!, etc., for both. It might be a bit more complicated since it's more abstract, but I would expect it to go smoothly, since we have the promote_operation and operate API to work abstractly on F.
Another advantage is that we have another scalar function defined in https://github.com/jump-dev/PolyJuMP.jl/blob/master/src/functions.jl.
And that would allow us to do

const VectorPolynomialFunction = VectorFunction{ScalarPolynomialFunction}

and get a working vector polynomial function in one line!

One might indeed wonder why we need to define our own vector type when Julia already has Base.Vector.
It makes sense for VectorAffineFunction, since it is implemented as a matrix rather than as a vector of vectors, but for the others, it's good to have a default vector function.

One could even define an AbstractFunctionConversionBridge from VectorFunction{ScalarAffineFunction} to VectorAffineFunction, even though we probably still want to create a VectorAffineFunction from JuMP directly, even if the JuMP functions are really VectorFunction{ScalarAffineFunction}!
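
As an editorial aside, the single-parametric-wrapper idea above can be sketched with self-contained toy stand-ins. The names here (VecFun, VarIdx, scalar_elements, and the Toy* aliases) are hypothetical, not MOI's actual API:

```julia
# Toy sketch of the suggestion: one parametric wrapper type with two aliases,
# so generic code (here, `scalar_elements`) is written only once.
# All names are hypothetical stand-ins, not the actual MOI definitions.
struct VarIdx
    value::Int64
end

struct VecFun{F}
    elements::Vector{F}
end

const ToyVectorOfVariables = VecFun{VarIdx}
const ToyVectorNonlinearFunction = VecFun{Any}

# A single generic method covers both aliases (and any future element type F).
scalar_elements(f::VecFun) = f.elements

vov = ToyVectorOfVariables([VarIdx(1), VarIdx(2)])
vnf = ToyVectorNonlinearFunction(Any[:(sin(x)), VarIdx(1)])
println(length(scalar_elements(vov)), " ", length(scalar_elements(vnf)))  # 2 2
```

The point of the sketch is that no method needs to dispatch on the concrete vector function type: everything works through the shared `VecFun{F}` layer.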

@odow
Member Author

odow commented Jun 13, 2023

This is an interesting suggestion. We'd need backward compat for (::VectorOfVariables).variables, but a generic AbstractVectorFunction that is a vector of scalar terms would be useful.
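
One possible shape for that backward-compatibility shim, sketched with hypothetical toy types (Var, GenericVec, and ToyVoV are illustrative names, not the MOI implementation):

```julia
# Toy sketch of keeping `(::VectorOfVariables).variables` working if the
# struct's real field were renamed to `elements`. Hypothetical names only.
struct Var
    value::Int64
end

struct GenericVec{F}
    elements::Vector{F}
end

const ToyVoV = GenericVec{Var}

function Base.getproperty(f::ToyVoV, name::Symbol)
    if name === :variables
        return getfield(f, :elements)  # legacy accessor forwards to new field
    end
    return getfield(f, name)
end

f = ToyVoV([Var(1), Var(2)])
println(f.variables === f.elements)  # true: both reach the same array
```

Overloading `Base.getproperty` like this keeps old field access compiling, at the cost of hiding the real field layout from users.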

@odow
Member Author

odow commented Jun 15, 2023

This is not fun. I think I'm going to revert the GenericVectorFunction changes to MOI.VectorOfVariables. There are too many little issues that crop up. Perhaps we can revisit once this PR is merged, and once we have a solution for the operate mess.

@blegat
Member

blegat commented Jun 15, 2023

There are too many little issues that crop up.

The tests seem to pass except for some printing issues though

@odow
Member Author

odow commented Jun 15, 2023

The tests seem to pass except for some printing issues though

The tests don't cover the code enough. If you try running the tests with Clp/SCS etc lots of things break.

@odow
Member Author

odow commented Jun 21, 2023

This is still blocked by the operate issue, but we're getting closer: #2215

@odow odow force-pushed the od/vnf branch 2 times, most recently from efecede to 233a93b on June 22, 2023 at 00:35
@odow
Member Author

odow commented Jun 22, 2023

If you try running the tests with SCS etc lots of things break.

Okay. One massive source of confusion for me had been failures with SCS, which were because the bridge in SCS was wrong: jump-dev/SCS.jl#277.

It used to do this:

julia> MOI.Bridges.print_active_bridges(model, MOI.VectorNonlinearFunction, MOI.PositiveSemidefiniteConeTriangle)
 * Unsupported constraint: MOI.GenericVectorFunction{Any}-in-MOI.PositiveSemidefiniteConeTriangle
 |  bridged by:
 |   SCS.ScaledPSDConeBridge{Float64, G} where G
 |  may introduce:
 |   * Supported constraint: MOI.VectorAffineFunction{Float64}-in-SCS.ScaledPSDCone

So no wonder lots of weird unrelated things broke.

But I only dug deep enough once I had figured out that it wasn't a method error with the various operate methods.

@odow
Member Author

odow commented Jun 23, 2023

So I was looking into VectorNonlinearFunction-in-PositiveSemidefiniteConeTriangle etc. And there are a number of thorny issues:

  • Utilities.eachscalar assumes that each row of the vector function is scalar_type(F). But the row of a VectorNonlinearFunction might not be ScalarNonlinearFunction.
  • Easy fix is to convert every row, but then what's the difference between Utilities.scalarize and Utilities.eachscalar?
  • Operations don't simplify. So f = sin(x) and then -(f, f) doesn't simplify to 0. So we get an expression swelling problem that prevents Utilities.isapprox_zero from working. The main problem with this is in the Bridges.Constraint.SquareBridge, which checks if off-diagonals are equivalent by checking if their difference is zero.
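
The expression-swelling point can be made concrete with a toy expression type (SNF and node_count here are stand-ins, not MOI's ScalarNonlinearFunction):

```julia
# Toy illustration of expression swelling: without simplification,
# -(f, f) produces a bigger expression instead of the constant 0, so a
# structural "is this zero?" check can never succeed.
struct SNF
    head::Symbol
    args::Vector{Any}
end

node_count(x) = 1
node_count(f::SNF) = 1 + sum(node_count, f.args)

# Operations build nodes; nothing simplifies.
toy_subtract(f, g) = SNF(:-, Any[f, g])

f = SNF(:sin, Any[:x])     # sin(x)
d = toy_subtract(f, f)     # sin(x) - sin(x): mathematically 0, structurally not
println(node_count(f), " ", node_count(d))  # 2 5
```

Each further operation roughly doubles the tree, which is why a difference-is-zero test like the one in SquareBridge runs into trouble.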

@blegat
Member

blegat commented Jun 23, 2023

Easy fix is to convert every row, but then what's the difference between Utilities.scalarize and Utilities.eachscalar?

Yes, I think every row should be an SNF. The difference with scalarize is that it allocates the vector of functions, while eachscalar does not. If you have a lot of rows but you're only going to get one, eachscalar will be faster.
In fact, scalarize is really like a conversion to a GenericVectorFunction.

The main problem with this is in the Bridges.Constraint.SquareBridge, which checks if off-diagonals are equivalent by checking if their difference is zero.

Maybe it should use isequal_canonical or something like that
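
A toy sketch of that scalarize-versus-eachscalar distinction (ToyVec, toy_scalarize, and ToyEachScalar are hypothetical stand-ins, not the MOI.Utilities API):

```julia
# Toy sketch: `toy_scalarize` eagerly allocates every row, while the
# `ToyEachScalar` wrapper is lazy and only extracts the row you index.
struct ToyVec
    rows::Vector{Any}
end

toy_scalarize(f::ToyVec) = copy(f.rows)        # allocates all rows up front

struct ToyEachScalar
    f::ToyVec
end
Base.getindex(it::ToyEachScalar, i::Int) = it.f.rows[i]  # one row on demand
Base.length(it::ToyEachScalar) = length(it.f.rows)

f = ToyVec(Any[:(sin(x)), :(cos(x)), :(x + 1)])
rows = toy_scalarize(f)       # all three rows materialized
one_row = ToyEachScalar(f)[2] # only row 2 touched
println(length(rows), " ", one_row)
```

With many rows and only one needed, the lazy wrapper avoids the O(n) allocation entirely.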

@odow
Member Author

odow commented Jun 26, 2023

Should scalarize be a copy? Currently it isn't always:

"""
scalarize(func::MOI.VectorOfVariables, ignore_constants::Bool = false)
Returns a vector of scalar functions making up the vector function in the form
of a `Vector{MOI.VariableIndex}`.
See also [`eachscalar`](@ref).
"""
function scalarize(f::MOI.VectorOfVariables, ignore_constants::Bool = false)
return f.variables
end

Ditto for vectorize:

"""
vectorize(x::AbstractVector{<:Number})
Returns `x`.
"""
vectorize(x::AbstractVector{<:Number}) = x
"""
vectorize(x::AbstractVector{MOI.VariableIndex})
Returns the vector of variables in the form of a `MOI.VectorOfVariables`.
"""
vectorize(x::AbstractVector{MOI.VariableIndex}) = MOI.VectorOfVariables(x)
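
To make the non-copying behaviour concrete, here is an editorial toy reproduction of the aliasing that results (VarRef, VoV, and the toy_* functions are hypothetical names, not the MOI functions):

```julia
# Toy reproduction of the aliasing being asked about: if scalarize and
# vectorize return/wrap the same underlying array with no copy, mutating
# one view mutates them all.
struct VarRef
    value::Int64
end

struct VoV
    variables::Vector{VarRef}
end

toy_scalarize(f::VoV) = f.variables          # no copy, like the code above
toy_vectorize(x::Vector{VarRef}) = VoV(x)    # wraps, doesn't copy

x = [VarRef(1), VarRef(2)]
f = toy_vectorize(x)
push!(toy_scalarize(f), VarRef(3))  # mutating the "scalarized" result...
println(length(x))                  # ...also changed the original x: prints 3
```

Whether that aliasing is a bug or a feature is exactly the design question; a copying variant would trade safety for an extra allocation on every call.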

@blegat
Member

blegat commented Jun 26, 2023

I don't think it has to be a copy. MA.operate(scalarize, f) should be a copy 😆

@odow odow changed the title WIP: Add VectorNonlinearFunction Add VectorNonlinearFunction Jun 26, 2023
@odow
Member Author

odow commented Jun 26, 2023

PATHSolver.jl and MultiObjectiveAlgorithms.jl are both passing tests with this branch, so I don't think we're far away. Just need https://github.com/jump-dev/MathOptInterface.jl/actions/runs/5383997349 to finish.

@mlubin
Member

mlubin commented Aug 11, 2023

I'll take a look over the weekend

4 resolved (outdated) review threads on src/Test/test_multiobjective.jl
Co-authored-by: Miles Lubin <mlubin@google.com>
@odow odow merged commit 92f73ae into master Aug 14, 2023
11 of 12 checks passed
@odow odow deleted the od/vnf branch August 14, 2023 00:06
@odow odow added the Project: next-gen nonlinear support Issues relating to nonlinear support label Sep 4, 2023

3 participants