Hi,

When running the MNIST example as follows, there is an error:
ERROR: LoadError: MethodError: Cannot `convert` an object of type Flux.OneHotMatrix{Array{Flux.OneHotVector,1}} to an object of type CLArrays.CLArray
This may have arisen from a call to the constructor CLArrays.CLArray(...),
since type constructors fall back to convert methods.
Stacktrace:
[1] CLArrays.CLArray(::Flux.OneHotMatrix{Array{Flux.OneHotVector,1}}) at ./sysimg.jl:24
[2] include_from_node1(::String) at ./loading.jl:569
[3] include(::String) at ./sysimg.jl:14
[4] process_options(::Base.JLOptions) at ./client.jl:305
[5] _start() at ./client.jl:371
while loading /Users/rveltz/work/prog_gd/julia/flux-mnist-cl.jl, in expression starting on line 20
It seems the error occurs because the type OneHotMatrix has not been wrapped into CLArrays. Is it an easy fix?
Thank you for your help,
Best regards.
using Flux, MNIST
using Flux: onehotbatch, argmax, mse, throttle
using Base.Iterators: repeated

x, y = traindata()
y = onehotbatch(y, 0:9)

m = Chain(
  Dense(28^2, 32, relu),
  Dense(32, 10),
  softmax)

# using CuArrays
# x, y = cu(x), cu(y)
# m = mapparams(cu, m)

using CLArrays
CLArrays.init(CLArrays.devices()[2])
cl = CLArray

x, y = cl(x), cl(y)
m = mapparams(cl, m)

loss(x, y) = mse(m(x), y)
dataset = repeated((x, y), 200)
evalcb = () -> @show(loss(x, y))
opt = SGD(params(m), 0.1)

Flux.train!(loss, dataset, opt, cb = throttle(evalcb, 5))

# Check the prediction for the first digit
argmax(m(x[:,1]), 0:9) == argmax(y[:,1], 0:9)
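One possible workaround (a sketch only, not confirmed against this CLArrays version): since the failure is the missing convert from OneHotMatrix to CLArray, materializing the one-hot labels into an ordinary dense Array first should sidestep that convert path entirely. The line `x, y = cl(x), cl(y)` above would become something like:

```julia
# Workaround sketch: collect the lazy OneHotMatrix into a plain
# Matrix{Float32} before wrapping it in a CLArray, so that no
# OneHotMatrix -> CLArray convert method is needed.
y_dense = Float32.(y)          # dense 10 x N matrix of 0f0 / 1f0
x, y = cl(x), cl(y_dense)      # both arguments are now ordinary arrays
```

The cost is that the labels are stored densely instead of lazily, which for MNIST (10 x 60000) is negligible; a proper fix would be a OneHotMatrix adaptor inside CLArrays itself.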