MethodError from using sum(x, 2) #34

Closed
domluna opened this issue Dec 12, 2016 · 5 comments · Fixed by #37

Comments

domluna commented Dec 12, 2016

p = randn(2,3)
ReverseDiff.@forward f(p) = exp.(p) ./ sum(exp.(p), 2) # softmax
f! = ReverseDiff.compile_gradient(x -> sum(f(x)), similar(p))
f!(similar(p), p)

Gives the following error

ERROR: MethodError: no method matching broadcast_deriv_increment!(::Array{ReverseDiff.TrackedReal{Float64,Float64,Void},2}, ::ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}}, ::Void)
Closest candidates are:
  broadcast_deriv_increment!(::AbstractArray{T,N}, ::Any) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:632
  broadcast_deriv_increment!(::Any, ::Any, ::Ref{T}) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:569
  broadcast_deriv_increment!(::AbstractArray{T,N}, ::Any, ::AbstractArray{T,N}) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:673
  ...
 in special_reverse_exec!(::ReverseDiff.SpecialInstruction{Base.#./,Tuple{ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Array{ReverseDiff.TrackedReal{Float64,Float64,Void},2}},ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Tuple{Array{Float64,2},Void}}) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:465
 in reverse_exec!(::ReverseDiff.SpecialInstruction{Base.#./,Tuple{ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Array{ReverseDiff.TrackedReal{Float64,Float64,Void},2}},ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Tuple{Array{Float64,2},Void}}) at /home/dom/.julia/v0.5/ReverseDiff/src/tape.jl:74
 in (::##33#34)() at /home/dom/.julia/v0.5/ReverseDiff/src/api/tape.jl:80
 in seeded_reverse_pass!(::Array{Float64,2}, ::ReverseDiff.TrackedReal{Float64,Float64,Void}, ::ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}}, ::ReverseDiff.Compiled{ReverseDiff.GradientTape{##29#30,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void}},##29#30,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void},##31#32,##33#34}) at /home/dom/.julia/v0.5/ReverseDiff/src/api/utils.jl:30
 in seeded_reverse_pass! at /home/dom/.julia/v0.5/ReverseDiff/src/api/tape.jl:41 [inlined]
 in gradient!(::Array{Float64,2}, ::ReverseDiff.Compiled{ReverseDiff.GradientTape{##29#30,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void}},##29#30,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void},##31#32,##33#34}, ::Array{Float64,2}) at /home/dom/.julia/v0.5/ReverseDiff/src/api/gradients.jl:80
 in (::ReverseDiff.##301#302{ReverseDiff.Compiled{ReverseDiff.GradientTape{##29#30,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void}},##29#30,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void},##31#32,##33#34}})(::Array{Float64,2}, ::Array{Float64,2}) at /home/dom/.julia/v0.5/ReverseDiff/src/api/tape.jl:100

If I instead write ReverseDiff.@forward f(p) = exp.(p) ./ sum(exp.(p)) there's no error, so I'm guessing the problem has something to do with passing a dimension argument to sum.

jrevels commented Dec 12, 2016

You're attempting ReverseDiff.@forward f(p::Array), which is not supported; only T<:Real arguments are supported. From the ReverseDiff.@forward docs:

  ReverseDiff.@forward(f)(args::Real...)
  ReverseDiff.@forward f(args::Real...) = ...
  ReverseDiff.@forward f = (args::Real...) -> ...

If you remove ReverseDiff.@forward, it should work fine.

There could be a better error message here, and also we should support this in the future.
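For example, here is the kind of usage @forward is meant for (a rough sketch I haven't run against this exact version; the scalar helper name myexp is just for illustration): wrap a scalar kernel with @forward and broadcast it over the array, rather than applying @forward to an array-valued function.

p = randn(2, 3)
ReverseDiff.@forward myexp(x::Real) = exp(x)   # scalar (T<:Real) argument, as the docs require
h(p) = myexp.(p) ./ sum(myexp.(p))             # broadcast the scalar kernel over the array
h! = ReverseDiff.compile_gradient(x -> sum(h(x)), similar(p))
h!(similar(p), p)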

domluna commented Dec 12, 2016

I should have mentioned this earlier, but even without ReverseDiff.@forward it doesn't work.

ERROR: MethodError: no method matching broadcast_deriv_increment!(::Array{ReverseDiff.TrackedReal{Float64,Float64,Void},2}, ::ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}}, ::Void)
Closest candidates are:
  broadcast_deriv_increment!(::AbstractArray{T,N}, ::Any) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:632
  broadcast_deriv_increment!(::Any, ::Any, ::Ref{T}) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:569
  broadcast_deriv_increment!(::AbstractArray{T,N}, ::Any, ::AbstractArray{T,N}) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:673
  ...
 in special_reverse_exec!(::ReverseDiff.SpecialInstruction{Base.#./,Tuple{ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Array{ReverseDiff.TrackedReal{Float64,Float64,Void},2}},ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Tuple{Array{Float64,2},Void}}) at /home/dom/.julia/v0.5/ReverseDiff/src/derivatives/elementwise.jl:465
 in reverse_exec!(::ReverseDiff.SpecialInstruction{Base.#./,Tuple{ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Array{ReverseDiff.TrackedReal{Float64,Float64,Void},2}},ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},Tuple{Array{Float64,2},Void}}) at /home/dom/.julia/v0.5/ReverseDiff/src/tape.jl:74
 in (::##5#6)() at /home/dom/.julia/v0.5/ReverseDiff/src/api/tape.jl:80
 in seeded_reverse_pass!(::Array{Float64,2}, ::ReverseDiff.TrackedReal{Float64,Float64,Void}, ::ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}}, ::ReverseDiff.Compiled{ReverseDiff.GradientTape{##1#2,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void}},##1#2,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void},##3#4,##5#6}) at /home/dom/.julia/v0.5/ReverseDiff/src/api/utils.jl:30
 in seeded_reverse_pass! at /home/dom/.julia/v0.5/ReverseDiff/src/api/tape.jl:41 [inlined]
 in gradient!(::Array{Float64,2}, ::ReverseDiff.Compiled{ReverseDiff.GradientTape{##1#2,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void}},##1#2,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void},##3#4,##5#6}, ::Array{Float64,2}) at /home/dom/.julia/v0.5/ReverseDiff/src/api/gradients.jl:80
 in (::ReverseDiff.##301#302{ReverseDiff.Compiled{ReverseDiff.GradientTape{##1#2,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void}},##1#2,ReverseDiff.TrackedArray{Float64,Float64,2,Array{Float64,2},Array{Float64,2}},ReverseDiff.TrackedReal{Float64,Float64,Void},##3#4,##5#6}})(::Array{Float64,2}, ::Array{Float64,2}) at /home/dom/.julia/v0.5/ReverseDiff/src/api/tape.jl:100

Looks like the same error.

jrevels commented Dec 12, 2016

That looks like the problem I fixed in #33. Have you updated to the latest release (v0.0.2)? If I copy and paste your code but remove the @forward, I get:

julia> using ReverseDiff

julia> begin
           p = randn(2,3)
           f(p) = exp.(p) ./ sum(exp.(p), 2) # softmax
           f! = ReverseDiff.compile_gradient(x -> sum(f(x)), similar(p))
           f!(similar(p), p)
       end
2×3 Array{Float64,2}:
 2.77556e-17  5.55112e-17  5.55112e-17
 0.0          0.0          0.0
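
In case it's useful, one quick way to check which version you have installed (a generic Julia 0.5 Pkg sketch, not specific to this issue):

julia> Pkg.installed("ReverseDiff")      # returns the installed version, e.g. v"0.0.2"

julia> Pkg.pin("ReverseDiff", v"0.0.2")  # optionally pin to the release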

domluna commented Dec 12, 2016

I was on master, which it seems is 2 commits ahead. I went back to v0.0.2 and it works.

jrevels commented Dec 12, 2016

Ah, that's not good. Thanks for letting me know. Looks like I should add softmax to the test suite! I'll reopen this.
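
For reference, a sketch of what such a test might look like (hypothetical; the names and tolerance are mine, not from the test suite). Since each row of the softmax sums to 1, sum(f(x)) is constant and its gradient should be zero up to floating-point noise:

using ReverseDiff, Base.Test

p = randn(2, 3)
f(p) = exp.(p) ./ sum(exp.(p), 2)   # row-wise softmax
g! = ReverseDiff.compile_gradient(x -> sum(f(x)), similar(p))
out = similar(p)
g!(out, p)

# sum(f(x)) == size(x, 1) for every x, so the gradient is identically zero
@test all(abs.(out) .< 1e-12)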
