
Type inference problem with binary operators on Julia master #75

Closed
KristofferC opened this issue Dec 6, 2015 · 9 comments

Comments

@KristofferC (Collaborator)

Just a heads up that something is broken in type inference when using Julia master.

Using this simple function:

function foo(x)
    println(eltype(x))
    b = 5.0
    c = x / b
    println(eltype(c))

    d = x * (1/b)
    println(eltype(d))
    return 3 * c
end

On 0.4.1 this gives the expected output:

julia> ForwardDiff.jacobian(foo, rand(2))
ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
2x2 Array{Float64,2}:
 0.6  0.0
 0.0  0.6

However, on 0.5 this gives:

julia> ForwardDiff.jacobian(foo, rand(2))
ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
########
ForwardDiff.GradientNumber{N,T,C} # <------ NOTE
########
ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
ERROR: no promotion exists for Int64 and ForwardDiff.GradientNumber{N,T,C}
 [inlined code] from promotion.jl:160
 in .* at arraymath.jl:118
 in foo at none:9
 in _calc_jacobian at /home/kristoffer/.julia/v0.5/ForwardDiff/src/api/jacobian.jl:101
 in jacobian at /home/kristoffer/.julia/v0.5/ForwardDiff/src/api/jacobian.jl:84
 in eval at ./boot.jl:263

It seems that the division x / b causes Julia to lose type inference for the array: the element type of c is inferred only as the abstract ForwardDiff.GradientNumber{N,T,C}, so the later 3 * c fails because no promotion rule exists between Int64 and that abstract type.
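
A quick scalar-level check (a sketch using the same Base.return_types query that appears later in this thread) is to ask inference about the division directly; on a healthy build this should return the concrete type rather than the unparameterized one:

using ForwardDiff

# Ask inference for the return type of the scalar division directly.
# Expected on a healthy build:
#   ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
# Observed symptom on master:
#   the abstract ForwardDiff.GradientNumber{N,T,C}
G = ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
Base.return_types(/, (G, Float64))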

@KristofferC (Collaborator, Author)

The @code_warntype output shows it as well:

Good with *:

julia> @code_warntype *(Array(ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}),1.0)
Variables:
  A::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},0}
  B::Float64

Body:
  begin  # abstractarraymath.jl, line 55:
      return A::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},0} .* B::Float64::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},0}
  end::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},0}

Bad with /:

julia> @code_warntype /(Array(ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}),1.0)
Variables:
  A::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},0}
  B::Float64

Body:
  begin  # abstractarraymath.jl, line 57:
      return A::Array{ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},0} ./ B::Float64::Array{ForwardDiff.GradientNumber{N,T,C},0}
  end::Array{ForwardDiff.GradientNumber{N,T,C},0}

@KristofferC (Collaborator, Author)

Doing @code_warntype on ./ shows the following instability in one of the lines:

      GenSym(4) = x::ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}} ./ y::Float64::ForwardDiff.GradientNumber{N,T,C}

@KristofferC (Collaborator, Author)

Seems to be a regression in type inference?
Same code: http://imgur.com/vcUHMHC

@KristofferC (Collaborator, Author)

Tracked at JuliaLang/julia#14294

@jrevels (Member)

jrevels commented Dec 7, 2015

Very nice detective work, thanks!

@KristofferC KristofferC changed the title Type inference problem with division on Julia master Type inference problem with binary operators on Julia master Jan 31, 2016
@KristofferC (Collaborator, Author)

Confirmed fixed on master:

julia> Base.return_types(*, (ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},Float64))
1-element Array{Any,1}:
 ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}

@mlubin (Contributor)

mlubin commented Feb 29, 2016

There was no regression test added upstream; could we put one here?

@KristofferC (Collaborator, Author)

I don't have a small repro right now.
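
For reference, such a regression test might look like the following sketch (a minimal sketch, assuming the Julia 0.4/0.5 Base.Test framework and the GradientNumber API used above; isleaftype was the concrete-type check in those versions):

using Base.Test
using ForwardDiff

# Assert that inference yields exactly one concrete return type for each
# scalar binary operator that regressed.
G = ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}}
for op in (+, -, *, /)
    rts = Base.return_types(op, (G, Float64))
    @test length(rts) == 1
    @test isleaftype(rts[1])
end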

@Ken-B

Ken-B commented May 8, 2016

It seems there was a regression on Julia master for this; could someone else verify?

julia> Base.return_types(*, (ForwardDiff.GradientNumber{2,Float64,Tuple{Float64,Float64}},Float64))
1-element Array{Any,1}:
 ForwardDiff.GradientNumber{N,T,C}

julia> versioninfo()
Julia Version 0.5.0-dev+3977
Commit 957f1d1 (2016-05-08 00:28 UTC)
Platform Info:
  System: Darwin (x86_64-apple-darwin15.4.0)
  CPU: Intel(R) Core(TM) i7-4980HQ CPU @ 2.80GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.7.1 (ORCJIT, haswell)
