Handle constraints inside modelingtoolkitize of OptimizationSystem #1983
Conversation
Needs tests
closes #1902
end
end

if (isnothing(prob.lcons) || all(isinf.(prob.lcons))) && (isnothing(prob.ucons) || all(isinf.(prob.ucons)))
Why is this the behaviour for equality constraints? It seems wrong.
Why? Btw did you see line 33 as well?
I meant that -inf <= cons <= inf yields cons ~ 0
I see what you are saying; it might be better not to assume that a user who doesn't pass equal constraints intends an equality constraint. I was assuming that case to be equivalent to an equality with 0, which might not always be true.
I think if a user supplies such a pair of (-inf, inf), it is either a modelling error or the intention is to deactivate the constraint. So in any case no constraint should be added; we might even raise an error. I wonder how JuMP handles such a case.
Added an error in this case
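The behaviour settled on above can be sketched as follows. This is a minimal Python sketch for illustration only (the PR itself is Julia); `classify_bound` and the returned string tags are hypothetical names, not the PR's API:

```python
import math

def classify_bound(lo, hi):
    """Classify one constraint from its lower/upper bounds.

    Mirrors the behaviour agreed above: a fully unbounded pair
    (-inf, inf) is treated as a modelling error instead of being
    silently turned into an equality with 0.
    """
    if math.isinf(lo) and lo < 0 and math.isinf(hi) and hi > 0:
        raise ValueError("constraint bounded by (-inf, inf): remove it or tighten the bounds")
    if lo == hi:
        return "equality"    # becomes lhs ~ lo
    return "inequality"      # becomes lo <= lhs <= hi

print(classify_bound(0.0, 0.0))    # equality
print(classify_bound(-1.0, 2.0))   # inequality
```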
if !isnothing(prob.lcons) && !isnothing(prob.ucons)
    cons = prob.lcons .≲ lhs .≲ prob.ucons
else
    if !isnothing(prob.lcons)
I think this can be refactored, since ucons and lcons are either both nothing or both not.
That holds true for ub and lb, but not for these. We could change it in SciMLBase, but I have left it as it is for correct behavior now and will make the change there subsequently.
if !isnothing(prob.ucons)
    for i in 1:num_cons
        if !isinf(prob.ucons[i])
            prob.lcons[i] !== prob.ucons[i] ? push!(cons, lhs[i] ~ prob.ucons[i]) : push!(cons, lhs[i] ≲ prob.ucons[i])
Isn't it the other way around? If ucons and lcons are equal, an equality has to be added.
Oh shit, yes, it's flipped; thanks for catching that. How did this not get caught by the tests, though? I'll check that.
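The corrected branch, an equality when the two bounds coincide and a one-sided inequality otherwise, can be sketched like this (Python for illustration; `constraint_for` is a hypothetical helper returning a textual form of the constraint, not the PR's code):

```python
import math

def constraint_for(lhs, lo, hi):
    """Return a textual form of the constraint added for one row.

    Equal bounds become an equality; otherwise a one-sided
    inequality is added for each finite bound -- the opposite of
    the flipped ternary flagged above.
    """
    if lo == hi:
        return f"{lhs} ~ {hi}"       # equality when lcons[i] == ucons[i]
    parts = []
    if not math.isinf(lo):
        parts.append(f"{lo} <= {lhs}")
    if not math.isinf(hi):
        parts.append(f"{lhs} <= {hi}")
    return " and ".join(parts)

print(constraint_for("g1", 0.0, 0.0))         # g1 ~ 0.0
print(constraint_for("g2", -math.inf, 1.0))   # g2 <= 1.0
```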
cons = []
end
de = OptimizationSystem(eqs, vec(vars), vec(params); |
toparam?
I am not sure I understood what you meant here
You are generating the parameters via variable(:α, i)
Generate `OptimizationSystem`, dependent variables, and parameters from an `OptimizationProblem`.
"""
function modelingtoolkitize(prob::DiffEqBase.OptimizationProblem; num_cons = 0, kwargs...)
Isn't num_cons simply isnothing(prob.lcons) ? 0 : length(prob.lcons)?
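The reviewer's suggestion amounts to deriving the count from lcons itself instead of taking a keyword argument. A Python sketch of that rule (the `infer_num_cons` name is hypothetical, for illustration):

```python
def infer_num_cons(lcons):
    """Derive the number of constraints from the problem's lcons,
    as suggested above: zero when no constraint bounds were
    supplied, otherwise their length."""
    return 0 if lcons is None else len(lcons)

print(infer_num_cons(None))         # 0
print(infer_num_cons([0.0, 1.0]))   # 2
```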
de3bde9 to 4f5349d
Should be ready now
p = prob.p
end

vars = reshape([variable(:x, i) for i in eachindex(prob.u0)], size(prob.u0))
This should be a restructure. Use the ODE one as a template, i.e. https://github.com/SciML/ModelingToolkit.jl/pull/1983/files#diff-5c721fb74d2a713d8da5efa082f9b6927958cbedf859232a859c252869193196R20-R21
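The distinction being raised: a plain reshape only recovers the shape of u0, while ArrayInterface.restructure also preserves u0's container type. A numpy sketch of the shape-only version (the `make_vars` name and string placeholders are hypothetical, just to illustrate what reshape does and does not recover):

```python
import numpy as np

def make_vars(u0):
    """Create one placeholder name per entry of u0, reshaped to
    u0's shape -- a plain-array analogue of the reshape in the
    diff above. Note this only keeps the shape; a non-Array u0
    type would be lost, which is what restructure avoids."""
    flat = np.array([f"x{i + 1}" for i in range(u0.size)], dtype=object)
    return flat.reshape(u0.shape)

u0 = np.zeros((2, 2))
print(make_vars(u0))   # 2x2 array of x1..x4
```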
b87e974 to 3c4d820
test/optimizationsystem.jl
generate_hessian(combinedsys)
hess_sparsity = ModelingToolkit.hessian_sparsity(sys1)
sparse_prob = OptimizationProblem(sys1, [x, y], [a, b], grad = true, sparse = true)
sparse_prob = OptimizationProblem(sys1, [x, y], [a, b], grad = true, obj_sparse = true)
What's obj_sparse, and how is it different from sparse?
In this PR I have removed sparse in favor of obj_sparse (the objective's sparsity) and cons_sparse (the constraints' sparsity). This is similar to the AutoModelingToolkit struct. A single kwarg is not sufficient since either could be sparse.
I'd keep obj_sparse as sparse. Consistency matters.
We can keep it, deprecate it, and interpret it as obj_sparse, but also keep these two for consistency with Optimization.jl?
No, not deprecate it. Why would an ODE use obj_sparse instead of sparse?
This will enable the AutoModelingToolkit in Optimization.jl to switch over to using modelingtoolkitize. SciML/Optimization.jl#432