ERROR: LoadError: UndefVarError: gpu not defined #246

Closed
swiesend opened this issue Apr 26, 2018 · 5 comments
@swiesend

swiesend commented Apr 26, 2018

Do I need Julia 0.7 for Flux 0.5? And how to checkout Flux 0.5 with Julia?

When I try to run a current model-zoo example, I get the following error:

model-zoo$ julia mnist/mlp.jl 
ERROR: LoadError: UndefVarError: gpu not defined
Stacktrace:
 [1] include_from_node1(::String) at ./loading.jl:576
 [2] include(::String) at ./sysimg.jl:14
 [3] process_options(::Base.JLOptions) at ./client.jl:305
 [4] _start() at ./client.jl:371
while loading /home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl, in expression starting on line 10
julia> Pkg.available("Flux")
12-element Array{VersionNumber,1}:
 v"0.1.0"
 v"0.1.1"
 v"0.2.0"
 v"0.2.1"
 v"0.2.2"
 v"0.3.0"
 v"0.3.1"
 v"0.3.2"
 v"0.3.3"
 v"0.3.4"
 v"0.4.0"
 v"0.4.1"
@swiesend
Author

swiesend commented Apr 26, 2018

If I check out v0.5.1 or v0.5.0 manually with git checkout tags/v0.5.1

* (HEAD detached at v0.5.1)
  master

then I can't precompile Flux, neither on those tagged versions nor on master:

$ julia
               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.6.2 (2017-12-13 18:08 UTC)
 _/ |\__'_|_|_|\__'_|  |  
|__/                   |  x86_64-linux-gnu

julia> Pkg.checkout("Flux")
INFO: Checking out Flux master...
INFO: Pulling Flux latest master...
INFO: No packages to install, update or remove

julia> Pkg.build("Flux")
INFO: Building SpecialFunctions
INFO: Building NNlib

julia> using Flux
INFO: Precompiling module Flux.
ERROR: LoadError: UndefVarError: @fix not defined
Stacktrace:
 [1] include_from_node1(::String) at ./loading.jl:576
 [2] include(::String) at ./sysimg.jl:14
 [3] anonymous at ./<missing>:2
while loading /home/sebastian/.julia/v0.6/Flux/src/Flux.jl, in expression starting on line 16
ERROR: Failed to precompile Flux to /home/sebastian/.julia/lib/v0.6/Flux.ji.
Stacktrace:
 [1] compilecache(::String) at ./loading.jl:710
 [2] _require(::Symbol) at ./loading.jl:497
 [3] require(::Symbol) at ./loading.jl:405

@swiesend
Author

swiesend commented Apr 26, 2018

Removing Flux manually (rm -rf ~/.julia/v0.6/Flux) and checking out the current master helps for the CPU run of model-zoo/mnist/mlp.jl, but not for the GPU run with CuArrays. Should that be a separate issue?

My CUDA version:

$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2016 NVIDIA Corporation
Built on Tue_Jan_10_13:22:03_CST_2017
Cuda compilation tools, release 8.0, V8.0.61

CPU:

julia> Pkg.checkout("NNlib")
INFO: Checking out NNlib master...
INFO: Pulling NNlib latest master...
INFO: No packages to install, update or remove

julia> Pkg.checkout("Flux")
INFO: Checking out Flux master...
INFO: Pulling Flux latest master...
WARNING: Cannot perform fast-forward merge.
INFO: Installing ZipFile v0.5.0
julia> include("/home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl")
INFO: Recompiling stale cache file /home/sebastian/.julia/lib/v0.6/Flux.ji for module Flux.
INFO: Downloading MNIST dataset
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   469  100   469    0     0    671      0 --:--:-- --:--:-- --:--:--   672
100 9680k  100 9680k    0     0   832k      0  0:00:11  0:00:11 --:--:--  965k
INFO: Downloading MNIST dataset
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   469  100   469    0     0    974      0 --:--:-- --:--:-- --:--:--   973
100 28881  100 28881    0     0  25492      0  0:00:01  0:00:01 --:--:-- 25492
INFO: Downloading MNIST dataset
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   467  100   467    0     0   1013      0 --:--:-- --:--:-- --:--:--  1015
100 1610k  100 1610k    0     0   692k      0  0:00:02  0:00:02 --:--:-- 1221k
INFO: Downloading MNIST dataset
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   467  100   467    0     0    800      0 --:--:-- --:--:-- --:--:--   799
100  4542  100  4542    0     0   4125      0  0:00:01  0:00:01 --:--:--  4125
loss(X, Y) = 2.422949021950707 (tracked)
loss(X, Y) = 1.5699422626022999 (tracked)
loss(X, Y) = 1.0060714383500833 (tracked)
loss(X, Y) = 0.7262447329417614 (tracked)
loss(X, Y) = 0.5770155296024464 (tracked)
loss(X, Y) = 0.5105802405264036 (tracked)
loss(X, Y) = 0.47058301177687256 (tracked)

GPU (with using CuArrays):

julia> include("/home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl")
INFO: Recompiling stale cache file /home/sebastian/.julia/lib/v0.6/CuArrays.ji for module CuArrays.
WARNING: could not import NNlib.conv2d_grad_x into CUDNN
WARNING: could not import NNlib.conv2d_grad_w into CUDNN
WARNING: could not import NNlib.pool into CUDNN
WARNING: could not import NNlib.pool_grad into CUDNN
ERROR: LoadError: CUDA error: an illegal memory access was encountered (code #700, ERROR_ILLEGAL_ADDRESS)
Stacktrace:
 [1] macro expansion at /home/sebastian/.julia/v0.6/CUDAdrv/src/base.jl:148 [inlined]
 [2] CUDAdrv.CuModule(::String, ::Dict{CUDAdrv.CUjit_option,Any}) at /home/sebastian/.julia/v0.6/CUDAdrv/src/module.jl:35
 [3] cufunction(::CUDAdrv.CuDevice, ::Any, ::Any) at /home/sebastian/.julia/v0.6/CUDAnative/src/jit.jl:488
 [4] macro expansion at /home/sebastian/.julia/v0.6/CUDAnative/src/execution.jl:108 [inlined]
 [5] _cuda(::Tuple{Int64,Int64}, ::Int64, ::CUDAdrv.CuStream, ::CuArrays.#broadcast_kernel, ::Flux.Tracker.##35#36, ::CUDAnative.CuDeviceArray{Float32,2,CUDAnative.AS.Global}, ::Tuple{Tuple{Bool,Bool}}, ::Tuple{Tuple{Int64,Int64}}, ::CUDAnative.CuDeviceArray{ForwardDiff.Dual{Void,Float32,3},2,CUDAnative.AS.Global}, ::Tuple{}) at /home/sebastian/.julia/v0.6/CUDAnative/src/execution.jl:80
 [6] _broadcast! at /home/sebastian/.julia/v0.6/CuArrays/src/broadcast.jl:22 [inlined]
 [7] broadcast_t at /home/sebastian/.julia/v0.6/CuArrays/src/broadcast.jl:37 [inlined]
 [8] broadcast_c at /home/sebastian/.julia/v0.6/CuArrays/src/broadcast.jl:58 [inlined]
 [9] broadcast at ./broadcast.jl:455 [inlined]
 [10] map(::Function, ::CuArray{ForwardDiff.Dual{Void,Float32,3},2}) at /home/sebastian/.julia/v0.6/CuArrays/src/utils.jl:62
 [11] (::Flux.Tracker.Broadcasted{Flux.##72#73{Base.#log},CuArray{ForwardDiff.Dual{Void,Float32,3},2}})() at /home/sebastian/.julia/v0.6/Flux/src/tracker/array.jl:287
 [12] tracked_broadcast(::Function, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}, ::TrackedArray{…,CuArray{Float32,2}}, ::Int64) at /home/sebastian/.julia/v0.6/Flux/src/tracker/array.jl:298
 [13] macro expansion at /home/sebastian/.julia/v0.6/NNlib/src/cubroadcast.jl:36 [inlined]
 [14] #crossentropy#71(::Int64, ::Function, ::TrackedArray{…,CuArray{Float32,2}}, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}) at /home/sebastian/.julia/v0.6/Flux/src/layers/stateless.jl:8
 [15] crossentropy(::TrackedArray{…,CuArray{Float32,2}}, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}) at /home/sebastian/.julia/v0.6/Flux/src/layers/stateless.jl:8
 [16] loss(::CuArray{Float32,2}, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}) at /home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl:21
 [17] #train!#130(::Flux.#throttled#14, ::Function, ::Function, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{CuArray{Float32,2},Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}}}}, ::Flux.Optimise.##71#75) at /home/sebastian/.julia/v0.6/Flux/src/optimise/train.jl:39
 [18] (::Flux.Optimise.#kw##train!)(::Array{Any,1}, ::Flux.Optimise.#train!, ::Function, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{CuArray{Float32,2},Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}}}}, ::Function) at ./<missing>:0
 [19] include_from_node1(::String) at ./loading.jl:576
 [20] include(::String) at ./sysimg.jl:14
while loading /home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl, in expression starting on line 29
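
One way to narrow an ERROR_ILLEGAL_ADDRESS like this down (a suggestion, not something from the thread) is to run a plain broadcast on a CuArray outside of Flux's tracker, to tell a CUDAnative/CuArrays problem apart from a Flux one. A minimal sketch:

julia> using CuArrays
julia> x = cu(rand(Float32, 10, 10))   # plain CuArray, no Flux tracking involved
julia> y = log.(x .+ 1f0)              # launches a broadcast kernel on the GPU
julia> typeof(y), size(y)

If this minimal broadcast already fails, the problem sits below Flux, in the CUDA stack.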

@gustafsson
Contributor

You might have some dependencies that are holding back upgrades. Do Pkg.test("CuArrays") and Pkg.test("Flux") pass? What if you start from scratch like this?

$ export JULIA_PKGDIR="some_other_temp_path"
$ julia
> Pkg.init()
> Pkg.add("CuArrays")
> Pkg.test("CuArrays")
> Pkg.add("Flux")
> Pkg.test("Flux")

@swiesend
Author

I followed your advice to start from scratch and noticed that LLVM was still complaining. So I rebuilt Julia v0.6.2 locally and started with a fresh JULIA_PKGDIR.

Is there any way to find out which package is holding the others back?
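
A hypothetical helper for exactly that, assuming the Julia 0.6 Pkg API already used above (Pkg.installed and Pkg.available): list every package whose installed version lags behind the newest registered one; those are the candidates being held back.

for (pkg, ver) in Pkg.installed()
    avail = try
        Pkg.available(pkg)             # registered versions from METADATA
    catch
        VersionNumber[]                # unregistered or checked-out packages
    end
    isempty(avail) && continue
    newest = maximum(avail)
    ver < newest && println(pkg, ": installed ", ver, ", newest registered ", newest)
end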

Now the tests are passing:

julia> Pkg.status("Flux")
 - Flux                          0.5.1
julia> Pkg.status("CuArrays")
 - CuArrays                      0.5.0
julia> Pkg.test("CuArrays")
INFO: Computing test dependencies for CuArrays...
INFO: Installing FFTW v0.0.4
INFO: Building FFTW
INFO: Testing CuArrays
INFO: Testing using device GeForce 940M
INFO: Testing CuArrays/CUDNN
Test Summary: | Pass  Total
CuArrays      |  676    676
INFO: CuArrays tests passed
INFO: Removing FFTW v0.0.4

julia> Pkg.test("Flux")
INFO: Testing Flux
...
INFO: Testing Flux/GPU
INFO: Testing Flux/CUDNN
Test Summary: | Pass  Total
Flux          |  172    172
INFO: Flux tests passed

But unfortunately the model-zoo example is still not working:

model-zoo$ git pull
Already up-to-date.
model-zoo$ git branch 
* master
julia> include("/home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl")
ERROR: LoadError: Broadcast output type Any is not concrete
Stacktrace:
 [1] broadcast_t at /home/sebastian/.julia/v0.6/CuArrays/src/broadcast.jl:34 [inlined]
 [2] broadcast_c at /home/sebastian/.julia/v0.6/CuArrays/src/broadcast.jl:63 [inlined]
 [3] broadcast at ./broadcast.jl:455 [inlined]
 [4] tracked_broadcast(::Function, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}, ::TrackedArray{…,CuArray{Float32,2}}, ::Int64) at /home/sebastian/.julia/v0.6/Flux/src/tracker/array.jl:278
 [5] #crossentropy#71(::Int64, ::Function, ::TrackedArray{…,CuArray{Float32,2}}, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}) at /home/sebastian/.julia/v0.6/Flux/src/layers/stateless.jl:8
 [6] crossentropy(::TrackedArray{…,CuArray{Float32,2}}, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}) at /home/sebastian/.julia/v0.6/Flux/src/layers/stateless.jl:8
 [7] loss(::CuArray{Float32,2}, ::Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}) at /home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl:21
 [8] #train!#130(::Flux.#throttled#14, ::Function, ::Function, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{CuArray{Float32,2},Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}}}}, ::Flux.Optimise.##71#75) at /home/sebastian/.julia/v0.6/Flux/src/optimise/train.jl:39
 [9] (::Flux.Optimise.#kw##train!)(::Array{Any,1}, ::Flux.Optimise.#train!, ::Function, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{CuArray{Float32,2},Flux.OneHotMatrix{CuArray{Flux.OneHotVector,1}}}}}, ::Function) at ./<missing>:0
 [10] include_from_node1(::String) at ./loading.jl:576
 [11] include(::String) at ./sysimg.jl:14
while loading /home/sebastian/develop/julia/flux/model-zoo/mnist/mlp.jl, in expression starting on line 29

@swiesend
Author

Checking out the current Flux master helps:

julia> Pkg.checkout("Flux")
INFO: Checking out Flux master...
INFO: Pulling Flux latest master...
WARNING: Cannot perform fast-forward merge.
INFO: No packages to install, update or remove

julia> Pkg.build("Flux")
INFO: Building SpecialFunctions

First I added a type check at line 11 of model-zoo/mnist/mlp.jl:

X = hcat(float.(reshape.(imgs, :))...) |> gpu
info(typeof(X))
julia> include("model-zoo/mnist/mlp.jl")
INFO: Recompiling stale cache file /home/sebastian/.julia/lib/v0.6/Flux.ji for module Flux.
INFO: CuArray{Float32,2}
loss(X, Y) = 2.3530197f0 (tracked)
loss(X, Y) = 0.6614058f0 (tracked)
loss(X, Y) = 0.41540956f0 (tracked)
loss(X, Y) = 0.32339448f0 (tracked)
loss(X, Y) = 0.2818481f0 (tracked)
0.9237
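
For reference, a minimal sketch of the pattern that works here (not the exact model-zoo file, so layer sizes are only illustrative): move both the data and the model to the GPU with gpu, so that crossentropy sees CuArrays on both sides.

using Flux, Flux.Data.MNIST, CuArrays
using Flux: onehotbatch, crossentropy

imgs   = MNIST.images()
labels = MNIST.labels()

X = hcat(float.(reshape.(imgs, :))...) |> gpu   # the INFO line above shows CuArray{Float32,2}
Y = onehotbatch(labels, 0:9) |> gpu             # one-hot targets, also on the GPU

m = Chain(Dense(28^2, 32, relu), Dense(32, 10), softmax) |> gpu
loss(x, y) = crossentropy(m(x), y)              # line 21 in the stacktraces above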
