
Refactor GenericNonlinearExpr and NonlinearOperator constructors #3489

Merged
7 commits merged into master from od/vector-valued-args on Sep 8, 2023

Conversation

@odow (Member) commented on Sep 7, 2023

Alternative for #3451

If we get rid of the "let's check if we can infer the variable type" logic, then a lot of things become much easier.
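
As a minimal sketch of the post-refactor direction (assuming the current JuMP API, where the variable type is supplied explicitly as a type parameter rather than inferred from the arguments):

using JuMP

model = Model()
@variable(model, x)

# Parameterize the expression on the variable type explicitly instead of
# scanning `args` to infer it.
expr = GenericNonlinearExpr{VariableRef}(:log, Any[x])

# `NonlinearExpr` is the alias for `GenericNonlinearExpr{VariableRef}`.
@objective(model, Min, expr)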

@codecov bot commented on Sep 7, 2023

Codecov Report

Patch coverage: 100.00% and project coverage change: -0.01% ⚠️

Comparison is base (a157126) 98.11% compared to head (cdecbb3) 98.11%.
Report is 1 commit behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #3489      +/-   ##
==========================================
- Coverage   98.11%   98.11%   -0.01%     
==========================================
  Files          37       37              
  Lines        5531     5529       -2     
==========================================
- Hits         5427     5425       -2     
  Misses        104      104              
Files Changed        Coverage            Δ
src/nlp_expr.jl      99.51% <100.00%>    (-0.01%) ⬇️
src/operators.jl     96.83% <100.00%>    (-0.02%) ⬇️


@blegat (Member) left a comment:

I like this

@odow (Member, Author) commented on Sep 7, 2023

For future reference, here are the functions I removed:

function variable_ref_type(
    ::Type{GenericNonlinearExpr},
    ::AbstractArray{T},
) where {T<:AbstractJuMPScalar}
    return variable_ref_type(T)
end

moi_function(x::AbstractArray) = moi_function.(x)

jump_function(model::GenericModel, x::AbstractArray) = jump_function.(model, x)

function test_nonlinear_operator_vector_args()
    model = Model()
    @variable(model, x[1:2, 1:2])
    op_det = NonlinearOperator(LinearAlgebra.det, :det)
    @inferred log(op_det(x))
    z = [2.0 1.0; 1.0 2.0]
    @inferred log(op_det(z))
    @objective(model, Min, log(op_det(x)))
    @test isequal_canonical(objective_function(model), log(op_det(x)))
    f = MOI.get(model, MOI.ObjectiveFunction{MOI.ScalarNonlinearFunction}())
    g = MOI.ScalarNonlinearFunction(:det, Any[index.(x)])
    h = MOI.ScalarNonlinearFunction(:log, Any[g])
    @test f ≈ h
    @test moi_function(log(op_det(x))) ≈ h
    return
end

Comment on removed lines 84 to 92:
function GenericNonlinearExpr(head::Symbol, args::Vector{Any})
    index = findfirst(Base.Fix2(isa, AbstractJuMPScalar), args)
    if index === nothing
        error(
            "Unable to create a nonlinear expression because it did not " *
            "contain any JuMP scalars. head = $head, args = $args.",
        )
    end
    return new{variable_ref_type(args[index])}(head, args)
end
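
For illustration, a short hedged sketch of how this untyped fallback behaves (the variable type is inferred from the first AbstractJuMPScalar found in `args`):

using JuMP

model = Model()
@variable(model, x)

# The fallback scans `args` for the first AbstractJuMPScalar and takes its
# variable type, so this produces a GenericNonlinearExpr{VariableRef}.
ex = GenericNonlinearExpr(:+, Any[2.0, x])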
A Contributor commented:

I understand the motivation for this. Unfortunately, it will break InfiniteOpt, but I can look into refactoring the workflow. If supporting array arguments necessitates this change, then I support it.

@odow (Member, Author) replied:

We can probably just keep this method then. It's not necessary. Let me make some changes.

@odow (Member, Author) replied:

Try this

@odow (Member, Author) replied:

I confirmed that this works with InfiniteOpt: infiniteopt/InfiniteOpt.jl#321

The Contributor replied:

Thanks, you beat me to it :)

@odow changed the title from "Add support for Array arguments in NonlinearOperator" to "Refactor GenericNonlinearExpr and NonlinearOperator constructors" on Sep 8, 2023
@odow merged commit 0df25a9 into master on Sep 8, 2023 (12 checks passed)
@odow deleted the od/vector-valued-args branch on September 8, 2023 at 03:37
@odow mentioned this pull request on Sep 8, 2023 (25 tasks)
Labels: none
Linked issues: none
Participants: 3