After upgrading from `CUDA@3.2.1` to `CUDA#master` I'm getting an error about memoization when precompiling. It looks like it's because Oceananigans.jl uses `has_cuda` in `__init__()`?

Posted on the Julia Slack's #gpu channel, where Tim suggested it might be a bug in Memoization.jl.
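For context, a common way to make an `__init__()` safe here is to skip side-effectful queries while Julia is generating a precompile image. The sketch below is hypothetical (it is not the actual Oceananigans source); it only illustrates guarding the GPU check with `jl_generating_output`, which is nonzero during precompilation:

```julia
# Sketch of a precompile-safe __init__-style function (assumes a
# `has_cuda`-style check would go in the runtime branch; hypothetical).
function my_init()
    # `jl_generating_output` is nonzero while Julia is precompiling,
    # so we defer the memoized CUDA query to first use at runtime.
    if ccall(:jl_generating_output, Cint, ()) == 0
        return :runtime_init   # safe to call CUDA.has_cuda() here
    else
        return :precompiling   # skip side-effectful GPU detection
    end
end
```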
```julia
julia> using LESbrary
[ Info: Precompiling LESbrary [0d21a966-bb61-11e9-3e8e-8da3363c58dd]
ERROR: LoadError: InitError: Evaluation into the closed module `Memoization` breaks incremental compilation because the side effects will not be permanent. This is likely due to some other module mutating `Memoization` with `eval` during precompilation - don't do this.
Stacktrace:
  [1] eval
    @ ./boot.jl:360 [inlined]
  [2] macro expansion
    @ ~/.julia/packages/Memoization/AL1Hc/src/Memoization.jl:46 [inlined]
  [3] get_cache(default::Type{IdDict}, func::typeof(CUDA.version))
    @ Memoization ~/.julia/packages/Memoization/AL1Hc/src/Memoization.jl:39
  [4] version
    @ ~/.julia/packages/Memoization/AL1Hc/src/Memoization.jl:125 [inlined]
  [5] functional(show_reason::Bool)
    @ CUDA ~/.julia/packages/CUDA/TojIE/src/initialization.jl:19
  [6] has_cuda (repeats 2 times)
    @ ~/.julia/packages/CUDA/TojIE/src/initialization.jl:110 [inlined]
  [7] __init__()
    @ Oceananigans ~/.julia/packages/Oceananigans/IxOwr/src/Oceananigans.jl:232
  [8] _include_from_serialized(path::String, depmods::Vector{Any})
    @ Base ./loading.jl:674
  [9] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String)
    @ Base ./loading.jl:760
 [10] _require(pkg::Base.PkgId)
    @ Base ./loading.jl:998
 [11] require(uuidkey::Base.PkgId)
    @ Base ./loading.jl:914
 [12] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:901
 [13] include
    @ ./Base.jl:386 [inlined]
 [14] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing)
    @ Base ./loading.jl:1213
 [15] top-level scope
    @ none:1
 [16] eval
    @ ./boot.jl:360 [inlined]
 [17] eval(x::Expr)
    @ Base.MainInclude ./client.jl:446
 [18] top-level scope
    @ none:1
during initialization of module Oceananigans
in expression starting at /home/alir/LESbrary.jl/src/LESbrary.jl:1
ERROR: Failed to precompile LESbrary [0d21a966-bb61-11e9-3e8e-8da3363c58dd] to /home/alir/.julia/compiled/v1.6/LESbrary/jl_gU3gtO.
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:33
 [2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::Base.TTY, internal_stdout::Base.TTY)
   @ Base ./loading.jl:1360
 [3] compilecache(pkg::Base.PkgId, path::String)
   @ Base ./loading.jl:1306
 [4] _require(pkg::Base.PkgId)
   @ Base ./loading.jl:1021
 [5] require(uuidkey::Base.PkgId)
   @ Base ./loading.jl:914
 [6] require(into::Module, mod::Symbol)
   @ Base ./loading.jl:901
 [7] top-level scope
   @ ~/.julia/packages/CUDA/TojIE/src/initialization.jl:55
```
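Reading the trace, the failure originates in `Memoization.get_cache`, which `eval`s a new cache into the `Memoization` module the first time a memoized function (here `CUDA.version`) is called; doing that inside another package's `__init__()` during precompilation is what Julia forbids. A precompile-safe alternative is to keep the cache as a constant next to the function definition, so first use mutates a container rather than eval-ing into a closed module. This is a hypothetical sketch (names and the stand-in value are mine, not from CUDA.jl or Memoization.jl):

```julia
# Sketch: memoization via a module-local cache instead of eval.
# The cache container exists at definition time, so calling the
# function during precompilation only mutates a Ref.
const _version_cache = Ref{Union{Nothing,VersionNumber}}(nothing)

function cached_version()
    if _version_cache[] === nothing
        _version_cache[] = v"3.3.0"  # stand-in for the expensive lookup
    end
    return _version_cache[]
end
```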