This repository has been archived by the owner on Apr 27, 2023. It is now read-only.

Error while executing examples #1

Closed
asbisen opened this issue Oct 8, 2020 · 1 comment

asbisen commented Oct 8, 2020

Executing basic_dense.jl on the current master results in an error.

Julia 1.5.0
Knet 1.4.2

ERROR: LoadError: MethodError: no method matching reluback(::Float64, ::Float32)
Closest candidates are:
  reluback(::AutoGrad.Value{var"##782"}, ::var"##783") where {var"##782", var"##783"} at none:0
  reluback(::var"##782", ::AutoGrad.Value{var"##783"}) where {var"##782", var"##783"} at none:0
  reluback(::T, ::T) where T at /home/abisen/.julia/packages/Knet/rgT4R/src/ops20/activation.jl:27
Stacktrace:
 [1] _broadcast_getindex_evalf at ./broadcast.jl:648 [inlined]
 [2] _broadcast_getindex at ./broadcast.jl:621 [inlined]
 [3] getindex at ./broadcast.jl:575 [inlined]
 [4] copy at ./broadcast.jl:876 [inlined]
 [5] materialize(::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{2},Nothing,typeof(Knet.Ops20.reluback),Tuple{Array{Float64,2},Array{Float32,2}}}) at ./broadcast.jl:837
 [6] back(::typeof(Base.Broadcast.broadcasted), ::Type{AutoGrad.Arg{2}}, ::Array{Float64,2}, ::AutoGrad.Result{Array{Float32,2}}, ::typeof(relu), ::AutoGrad.Result{Array{Float32,2}}) at ./none:0
 [7] differentiate(::Function; o::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /home/abisen/.julia/packages/AutoGrad/VFrAv/src/core.jl:165
 [8] differentiate at /home/abisen/.julia/packages/AutoGrad/VFrAv/src/core.jl:135 [inlined]
 [9] step!(::Workout, ::Tuple{Array{Float64,2},Array{Float64,2}}; zerograd::Bool) at /home/abisen/.julia/dev/Photon/src/train.jl:233
 [10] step! at /home/abisen/.julia/dev/Photon/src/train.jl:231 [inlined]
 [11] train!(::Workout, ::Base.Iterators.Zip{Tuple{Array{Array{Float64,2},1},Array{Array{Float64,2},1}}}, ::Nothing; epochs::Int64, cb::ConsoleMeter) at /home/abisen/.julia/dev/Photon/src/train.jl:301
 [12] train!(::Workout, ::Base.Iterators.Zip{Tuple{Array{Array{Float64,2},1},Array{Array{Float64,2},1}}}, ::Nothing) at /home/abisen/.julia/dev/Photon/src/train.jl:295 (repeats 2 times)
 [13] top-level scope at /tmp/t.jl:19
 [14] include(::Function, ::Module, ::String) at ./Base.jl:380
 [15] include(::Module, ::String) at ./Base.jl:368
 [16] exec_options(::Base.JLOptions) at ./client.jl:296
 [17] _start() at ./client.jl:506
in expression starting at /tmp/t.jl:19
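For context: the MethodError comes from the broadcast of `reluback` in frame [5] of the trace. `reluback(::T, ::T) where T` requires both arguments to share one element type, but here the incoming gradient is a Float64 array while the forward relu activations are Float32. A minimal sketch of the mismatch and a plausible workaround, assuming the example feeds default Float64 arrays into a model whose parameters are Float32 (the array names below are illustrative, not from basic_dense.jl):

```julia
using Knet   # assuming Knet 1.4.2, as in the report

# Activations from relu.() over Float32 data are Float32 ...
y  = relu.(randn(Float32, 3, 4))
# ... but Float64 input data leads to a Float64 incoming gradient.
dy = randn(3, 4)                     # Float64 by default

# This mirrors the failing broadcast in the stack trace:
# MethodError: no method matching reluback(::Float64, ::Float32)
# Knet.Ops20.reluback.(dy, y)

# Possible workaround until the fix: keep input data in the same
# element type as the model parameters, e.g. convert it to Float32.
x = Float32.(randn(100, 10))
```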

jbaron commented Oct 8, 2020

Thanks for reporting. I noticed that master was one commit behind my workstation, so I pushed the missing commit (at least the example above now works correctly on my workstation).

P.S. In the last release I changed a lot of how the tensor types are determined, so hopefully this is the last defect caused by that change. The test suite normally runs fine, in case you need more examples.

jbaron closed this as completed on Nov 27, 2020.