Incorrect initialization of derivative storage #97

Closed
andreasnoack opened this issue Feb 4, 2019 · 3 comments · Fixed by #112

Comments

@andreasnoack
Contributor

In

```julia
function TwiceDifferentiable(f, g, fg, h, x::TX, F::T = real(zero(eltype(x))), G::TG = similar(x), H::TH = alloc_H(x); inplace = true) where {T, TG, TH, TX}
```

`similar(x)` is used to initialize the gradient array, but it need not have the right element type. In a nested derivative computation such as

```julia
julia> myf(x,y) = x^2+y^2
myf (generic function with 1 method)

julia> ForwardDiff.gradient(s -> optimize(t -> myf(t[1], s[1]), [0.0], Newton(), autodiff=:forward).minimum, [0.4])
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1})
Closest candidates are:
  Float64(::Real, ::RoundingMode) where T<:AbstractFloat at rounding.jl:194
  Float64(::T<:Number) where T<:Number at boot.jl:741
  Float64(::Int8) at float.jl:60
  ...
Stacktrace:
 [1] convert(::Type{Float64}, ::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1}) at ./number.jl:7
 [2] setproperty!(::DiffResults.MutableDiffResult{1,Float64,Tuple{Array{Float64,1}}}, ::Symbol, ::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1}) at ./sysimg.jl:19
 [3] value!(::DiffResults.MutableDiffResult{1,Float64,Tuple{Array{Float64,1}}}, ::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1}) at /Users/andreasnoack/.julia/packages/DiffResults/1LURj/src/DiffResults.jl:159
 [4] extract_gradient!(::Type{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64}}, ::DiffResults.MutableDiffResult{1,Float64,Tuple{Array{Float64,1}}}, ::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}) at /Users/andreasnoack/.julia/packages/ForwardDiff/okZnq/src/gradient.jl:70
 [5] vector_mode_gradient!(::DiffResults.MutableDiffResult{1,Float64,Tuple{Array{Float64,1}}}, ::getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}}, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1},1}}) at /Users/andreasnoack/.julia/packages/ForwardDiff/okZnq/src/gradient.jl:103
 [6] gradient!(::DiffResults.MutableDiffResult{1,Float64,Tuple{Array{Float64,1}}}, ::getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}}, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1},1}}, ::Val{true}) at /Users/andreasnoack/.julia/packages/ForwardDiff/okZnq/src/gradient.jl:35
 [7] gradient!(::DiffResults.MutableDiffResult{1,Float64,Tuple{Array{Float64,1}}}, ::getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}}, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1},1}}) at /Users/andreasnoack/.julia/packages/ForwardDiff/okZnq/src/gradient.jl:33
 [8] (::getfield(NLSolversBase, Symbol("##40#46")){getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##38#40")){Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}},Float64},Float64,1},1}}})(::Array{Float64,1}, ::Array{Float64,1}) at /Users/andreasnoack/.julia/dev/NLSolversBase/src/objective_types/twicedifferentiable.jl:122
 [9] value_gradient!!(::TwiceDifferentiable{Float64,Array{Float64,1},Array{Float64,2},Array{Float64,1}}, ::Array{Float64,1}) at /Users/andreasnoack/.julia/dev/NLSolversBase/src/interface.jl:82
 [10] initial_state(::Newton{LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}}}, ::Optim.Options{Float64,Nothing}, ::TwiceDifferentiable{Float64,Array{Float64,1},Array{Float64,2},Array{Float64,1}}, ::Array{Float64,1}) at /Users/andreasnoack/.julia/dev/Optim/src/multivariate/solvers/second_order/newton.jl:45
 [11] #optimize#87 at /Users/andreasnoack/.julia/dev/Optim/src/multivariate/optimize/optimize.jl:33 [inlined]
 [12] (::getfield(Optim, Symbol("#kw##optimize")))(::NamedTuple{(:autodiff,),Tuple{Symbol}}, ::typeof(optimize), ::Function, ::Array{Float64,1}, ::Newton{LineSearches.InitialStatic{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}}}, ::Optim.Options{Float64,Nothing}) at ./none:0 (repeats 2 times)
 [13] vector_mode_gradient(::getfield(Main, Symbol("##37#39")), ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}}) at ./none:1
 [14] gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}}, ::Val{true}) at /Users/andreasnoack/.julia/packages/ForwardDiff/okZnq/src/gradient.jl:17
 [15] gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Main, Symbol("##37#39")),Float64},Float64,1},1}}) at /Users/andreasnoack/.julia/packages/ForwardDiff/okZnq/src/gradient.jl:15 (repeats 2 times)
 [16] top-level scope at none:0
```

the gradient will be a vector of duals, but the storage allocated by `similar(x)` is a plain `Float64` array, which causes the conversion error above.
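For what it's worth, here is a minimal workaround sketch (not from the issue itself; it mirrors the reproducer above and assumes that matching the starting point's element type to the outer dual type is enough on the versions in question): allocating the starting point with `eltype(s)` makes `similar(x)` produce dual-capable storage.

```julia
using Optim, ForwardDiff

myf(x, y) = x^2 + y^2

# Allocate the starting point with the outer dual element type so that
# similar(x) inside TwiceDifferentiable can hold dual-valued gradients.
ForwardDiff.gradient(
    s -> optimize(t -> myf(t[1], s[1]),
                  eltype(s)[0.0],          # instead of the Float64 [0.0]
                  Newton(),
                  autodiff = :forward).minimum,
    [0.4])
```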

@mfherbst

Still failing for me on NLSolversBase v7.6.0, Optim v0.20.0 and ForwardDiff v0.10.9.

@pkofod
Member

pkofod commented Jan 27, 2020

> Still failing for me on NLSolversBase v7.6.0, Optim v0.20.0 and ForwardDiff v0.10.9.

Sorry, I was not clear here. The "fix" was simply that you can now supply the types yourself; it still defaults to the old behavior. Although, as you showed, it didn't really work for complex gradients :)
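To illustrate what "supply the types yourself" could look like (a hedged sketch against the constructor signature quoted at the top of the issue; the toy objective and the dual stand-in type `D` are made up for this example, and whether Optim then runs happily with dual-typed storage depends on the versions involved):

```julia
using NLSolversBase, ForwardDiff

# Stand-in for the dual type an outer ForwardDiff.gradient would introduce.
D = ForwardDiff.Dual{Nothing,Float64,1}

f(t)      = t[1]^2
g!(G, t)  = (G[1] = 2t[1]; G)
fg!(G, t) = (g!(G, t); f(t))
h!(H, t)  = (H[1, 1] = 2; H)

# Pass F, G, and H explicitly instead of letting them default to
# real(zero(eltype(x))), similar(x), and alloc_H(x).
td = TwiceDifferentiable(f, g!, fg!, h!, [0.0],
                         zero(D),         # F: dual-capable scalar storage
                         zeros(D, 1),     # G: dual-capable gradient storage
                         zeros(D, 1, 1))  # H: dual-capable Hessian storage
```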

@mfherbst

Ah ... I see. Thanks for the clarification.
