Description
I have an NLP that I am solving through Optimization.jl, using either Ipopt or MadNLP as the solver. Both previously worked, but I now get an error from ForwardDiff about mismatched tags.
I noticed that custom tagging was introduced in 0.6.11; if I pin to 0.6.10, I do not see the issue. I do not have the full stacktrace on this machine, but I can type out the top line, where the tagged function and arguments differ. Also, my underlying functions are a mix of custom transcription on top of MTK models, so the type information is incredibly long; I have abbreviated it with `...`.
The error is thrown here, where

```
FT = DifferentiationInterface.FixTail{OptimizationBase.var"#lagrangian#28"{...}, Tuple{Float64, Vector{Float64}, SciMLBase.NullParameters}}
```

but the provided function `f::F` is

```
f::DifferentiationInterface.FixTail{OptimizationBase.var"#lagrangian#28"{...}, Tuple{Float64, SubArray{Float64, 1, Vector{Float64}, Tuple{UnitRange{Int64}}, true}, SciMLBase.NullParameters}}
```

where we can see the differing `Vector` vs `SubArray` in the tuples.
The tagged input type and the one provided to `check_tag` are identical, and begin like:

```
ForwardDiff.Dual{ForwardDiff.Tag{DifferentiationInterfaceForwardDiffExt.ForwardDiffOverSomethingHVPWrapper{...}}
```
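In case it helps triage before I have an MWE, here is a minimal sketch of what I *assume* the mechanism is (this is not the actual DifferentiationInterface internals; `FixTail2`, `g`, and the captured tuples are hypothetical stand-ins): a closure whose type encodes the types of its captured arguments will produce a different ForwardDiff tag when a captured `Vector` becomes a `SubArray`.

```julia
using ForwardDiff

# Hypothetical stand-in for DifferentiationInterface.FixTail: a callable whose
# *type* includes the types of the captured tail arguments.
struct FixTail2{F,A}
    f::F
    tail::A
end
(ft::FixTail2)(x) = ft.f(x, ft.tail...)

g(x, sigma, lambda) = sigma * sum(abs2, x) + sum(lambda .* x)

x     = rand(3)
lvec  = rand(3)             # Vector{Float64}
lview = view(rand(5), 1:3)  # SubArray{Float64, ...}

f1 = FixTail2(g, (1.0, lvec))   # tail :: Tuple{Float64, Vector{Float64}}
f2 = FixTail2(g, (1.0, lview))  # tail :: Tuple{Float64, SubArray{...}}

# The config's tag is parameterized on typeof(f1), so reusing it with f2
# (a different type, solely because of the captured SubArray) fails the
# tag check even though both closures compute the same thing.
cfg = ForwardDiff.GradientConfig(f1, x)
ForwardDiff.gradient(f1, x, cfg)  # works
ForwardDiff.gradient(f2, x, cfg)  # throws ForwardDiff.InvalidTagException
```

If this matches the real mechanism, the `Vector` vs `SubArray` difference in the captured tuple alone would be enough to change the closure's type and therefore invalidate the tag.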
I apologize for the limited information here. If the issue is obvious, wonderful! If not, I can try to extract an MWE, as this issue pops up in a much larger process. For now I will pin to 0.6.10. Thanks!