Integration with CuArrays: exp vs CUDAnative.exp #20

Closed

dfdx opened this issue Dec 25, 2017 · 1 comment

dfdx (Contributor) commented Dec 25, 2017

using CuArrays
using CUDAnative
using NNlib

A = cu(rand(Float32, 10, 100))
B = cu(rand(Float32, 10))
CUDAnative.exp.(NNlib.σ.(A .+ B))

gives:

warning: ignoring debug info with an invalid version (0) in #1
warning: ignoring debug info with an invalid version (0) in 
ERROR: LLVM error: All DICompileUnits must be listed in llvm.dbg.cu

Stacktrace:
 [1] verify(::LLVM.Module) at /home/slipslop/.julia/v0.6/LLVM/src/analysis.jl:11
 [2] #add_entry!#26(::Bool, ::Function, ::LLVM.Module, ::Any, ::Any) at /home/slipslop/.julia/v0.6/CUDAnative/src/jit.jl:251
 [3] (::CUDAnative.#kw##add_entry!)(::Array{Any,1}, ::CUDAnative.#add_entry!, ::LLVM.Module, ::Any, ::Any) at ./<missing>:0
 [4] #compile_function#51(::Bool, ::Function, ::Any, ::Any, ::VersionNumber) at /home/slipslop/.julia/v0.6/CUDAnative/src/jit.jl:402
 [5] cufunction(::CUDAdrv.CuDevice, ::Any, ::Any) at /home/slipslop/.julia/v0.6/CUDAnative/src/jit.jl:465
 [6] macro expansion at /home/slipslop/.julia/v0.6/CUDAnative/src/execution.jl:108 [inlined]
 [7] _cuda(::Tuple{Int64,Int64}, ::Int64, ::CUDAdrv.CuStream, ::CuArrays.#broadcast_kernel, ::##1#2, ::CUDAnative.CuDeviceArray{Float32,2,CUDAnative.AS.Global}, ::Tuple{Tuple{Bool,Bool},Tuple{Bool}}, ::Tuple{Tuple{Int64,Int64},Tuple{Int64}}, ::CUDAnative.CuDeviceArray{Float32,2,CUDAnative.AS.Global}, ::Tuple{CUDAnative.CuDeviceArray{Float32,1,CUDAnative.AS.Global}}) at /home/slipslop/.julia/v0.6/CUDAnative/src/execution.jl:80
 [8] _broadcast! at /home/slipslop/.julia/v0.6/CuArrays/src/broadcast.jl:22 [inlined]
 [9] broadcast_t at /home/slipslop/.julia/v0.6/CuArrays/src/broadcast.jl:37 [inlined]
 [10] broadcast_c at /home/slipslop/.julia/v0.6/CuArrays/src/broadcast.jl:58 [inlined]
 [11] broadcast(::Function, ::CuArray{Float32,2}, ::CuArray{Float32,1}) at ./broadcast.jl:455

Most likely this is caused by the difference between Base.exp and CUDAnative.exp: σ uses the former, which doesn't compile for CuArrays. Normally I resolve this by overloading the function (like this for CUDAnative.log; a sketch of the pattern follows), but in NNlib sigmoid is defined for Float32 scalars and then broadcast over whatever array type it's applied to.
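
For reference, the per-function workaround looks roughly like this (just a sketch; the Julia 0.6-style broadcast hook shown is my approximation, not code from any package):

using CuArrays, CUDAnative

# Sketch: intercept broadcasting of the Base function over CuArray and
# substitute the device intrinsic, so kernels compile with CUDAnative.log.
Base.broadcast(::typeof(Base.log), A::CuArray) = broadcast(CUDAnative.log, A)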

Right now I can't think of a way to define σ (and similar functions) so that it works for both Array and CuArray; ideas are welcome.
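
To make the dilemma concrete, here's a paraphrase of the sigmoid definition and the only external hook I can see (sketches, not the actual NNlib or CuArrays source):

using CuArrays, CUDAnative

# Roughly how NNlib defines sigmoid (paraphrased):
σ(x::Real) = one(x) / (one(x) + exp(-x))  # `exp` binds to Base.exp here

# The only hook available from outside is the broadcast itself, e.g.
# (hypothetical):
Base.broadcast(::typeof(σ), A::CuArray) =
    broadcast(x -> one(x) / (one(x) + CUDAnative.exp(-x)), A)
# ...which means duplicating every activation function's body by hand.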


I'm using Julia 0.6.2, CUDAnative 0.5.3 (the last release for Julia 0.6.x), and the latest master of CuArrays.

MikeInnes (Member) commented

Broadcasting NNlib functions should work now. User functions will have to wait for a Cassette-like solution.
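
A minimal check of the fixed behavior (assuming a matching NNlib/CuArrays pair):

using CuArrays, NNlib

A = cu(rand(Float32, 10, 100))
B = cu(rand(Float32, 10))
NNlib.σ.(A .+ B)  # compiles to a GPU kernel without spelling out CUDAnative.exp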

ToucheSir pushed a commit that referenced this issue Feb 13, 2023
Bump patch and check for failures with Julia 1.7-nightly