warnings in running examples/optimizers.jl #51
Can you indicate the language, package, and library versions? That is, the output of versioninfo() and Pkg.status() in Julia, and the versions of CUDA and cuDNN.
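For reference, that information can be collected as follows (the cudnn.h path below is the common default and may differ on your system):

julia> versioninfo()      # Julia build, OS, CPU, BLAS/LLVM details
julia> Pkg.status()       # versions of Knet, AutoGrad, CUDArt, etc.
% nvcc --version          # CUDA toolkit version
% grep -A 2 CUDNN_MAJOR /usr/local/cuda/include/cudnn.h    # cuDNN version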
On Thu, Dec 15, 2016, 15:00, ngphuoc wrote:
I tried to run optimizers.jl and got a lot of the following warnings:
% julia optimizers.jl
WARNING: cudnn.cudnnConvolutionBackwardData error 3
in #conv4x#70(::Ptr{Void}, ::Float32, ::Float32, ::Int64, ::Ptr{Void}, ::Int64, ::Array{Any,1}, ::Function, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}) at /home/phuoc/.julia/v0.5/Knet/src/cuda44.jl:50
in #conv4x#85(::Array{Any,1}, ::Function, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::Knet.KnetArray{Float32,4}) at ./<missing>:0
in #conv4#74(::Array{Any,1}, ::Function, ::Type{AutoGrad.Grad{2}}, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}) at ./<missing>:0
in conv4(::Type{AutoGrad.Grad{2}}, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}) at ./<missing>:0
in backward_pass(::AutoGrad.Rec{Array{Any,1}}, ::AutoGrad.Rec{Float32}, ::Array{AutoGrad.Node,1}) at /home/phuoc/.julia/v0.5/AutoGrad/src/core.jl:212
in (::AutoGrad.##gradfun#1#3{Optimizers.#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Any,1}, ::Vararg{Any,N}) at /home/phuoc/.julia/v0.5/AutoGrad/src/core.jl:47
in (::AutoGrad.#gradfun#2)(::Array{Any,1}, ::Vararg{Any,N}) at /home/phuoc/.julia/v0.5/AutoGrad/src/core.jl:47
in #train#4(::Float64, ::Int64, ::Int64, ::Function, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}) at /home/phuoc/.julia/v0.5/Knet/examples/optimizers.jl:68
in (::Optimizers.#kw##train)(::Array{Any,1}, ::Optimizers.#train, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}) at ./<missing>:0
in macro expansion at /home/phuoc/.julia/v0.5/Knet/examples/optimizers.jl:57 [inlined]
in macro expansion at ./util.jl:184 [inlined]
in main(::Array{String,1}) at /home/phuoc/.julia/v0.5/Knet/examples/optimizers.jl:56
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:262
in _start() at ./client.jl:318
WARNING: cudnn.cudnnConvolutionBackwardData error 3
in #conv4x#70(::Ptr{Void}, ::Float32, ::Float32, ::Int64, ::Ptr{Void}, ::Int64, ::Array{Any,1}, ::Function, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}) at /home/phuoc/.julia/v0.5/Knet/src/cuda44.jl:50
in #conv4x#85(::Array{Any,1}, ::Function, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::Knet.KnetArray{Float32,4}) at ./<missing>:0
in #conv4#74(::Array{Any,1}, ::Function, ::Type{AutoGrad.Grad{2}}, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}) at ./<missing>:0
in conv4(::Type{AutoGrad.Grad{2}}, ::Knet.KnetArray{Float32,4}, ::Knet.KnetArray{Float32,4}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}, ::AutoGrad.Rec{Knet.KnetArray{Float32,4}}) at ./<missing>:0
in backward_pass(::AutoGrad.Rec{Array{Any,1}}, ::AutoGrad.Rec{Float32}, ::Array{AutoGrad.Node,1}) at /home/phuoc/.julia/v0.5/AutoGrad/src/core.jl:212
in (::AutoGrad.##gradfun#1#3{Optimizers.#loss,Int64})(::Array{Any,1}, ::Function, ::Array{Any,1}, ::Vararg{Any,N}) at /home/phuoc/.julia/v0.5/AutoGrad/src/core.jl:47
in (::AutoGrad.#gradfun#2)(::Array{Any,1}, ::Vararg{Any,N}) at /home/phuoc/.julia/v0.5/AutoGrad/src/core.jl:47
in #train#4(::Float64, ::Int64, ::Int64, ::Function, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}) at /home/phuoc/.julia/v0.5/Knet/examples/optimizers.jl:68
in (::Optimizers.#kw##train)(::Array{Any,1}, ::Optimizers.#train, ::Array{Any,1}, ::Array{Any,1}, ::Array{Any,1}) at ./<missing>:0
in macro expansion at /home/phuoc/.julia/v0.5/Knet/examples/optimizers.jl:57 [inlined]
in macro expansion at ./util.jl:184 [inlined]
in main(::Array{String,1}) at /home/phuoc/.julia/v0.5/Knet/examples/optimizers.jl:56
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:262
in _start() at ./client.jl:318
77.394763 seconds (26.57 M allocations: 1.085 GB, 1.71% gc time)
...
It finally ran but the result seems wrong:
(:epoch,0,:trn,(0.091516666f0,2.3076072f0),:tst,(0.086f0,2.3085766f0))
(:epoch,1,:trn,(0.11236667f0,2.3013859f0),:tst,(0.1135f0,2.3013237f0))
(:epoch,2,:trn,(0.11236667f0,2.301366f0),:tst,(0.1135f0,2.3012974f0))
(:epoch,3,:trn,(0.11236667f0,2.301365f0),:tst,(0.1135f0,2.301296f0))
(:epoch,4,:trn,(0.11236667f0,2.301363f0),:tst,(0.1135f0,2.301292f0))
(:epoch,5,:trn,(0.11236667f0,2.301363f0),:tst,(0.1135f0,2.3012917f0))
(:epoch,6,:trn,(0.11236667f0,2.301362f0),:tst,(0.1135f0,2.3012917f0))
(:epoch,7,:trn,(0.11236667f0,2.3013616f0),:tst,(0.1135f0,2.301291f0))
(:epoch,8,:trn,(0.11236667f0,2.301362f0),:tst,(0.1135f0,2.301291f0))
(:epoch,9,:trn,(0.11236667f0,2.3013618f0),:tst,(0.1135f0,2.3012905f0))
(:epoch,10,:trn,(0.11236667f0,2.3013618f0),:tst,(0.1135f0,2.301291f0))
The other examples ran correctly without warning.
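For what it's worth, cuDNN status code 3 is CUDNN_STATUS_BAD_PARAM, which fits the symptoms: with cudnnConvolutionBackwardData failing, the convolution weights never receive gradients, so accuracy stays at chance level (about 0.11 on 10 classes) and the loss sits near ln(10) ≈ 2.303. A minimal sketch for decoding such a status code from the Julia REPL, assuming libcudnn is on the loader path (cudnn_errstr is a hypothetical helper, not part of Knet):

julia> cudnn_errstr(s) = unsafe_string(ccall((:cudnnGetErrorString, "libcudnn"), Cstring, (Cint,), s))
julia> cudnn_errstr(3)    # expected: "CUDNN_STATUS_BAD_PARAM"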
Ozan pushed a fix. Can you recheck?
On Thu, Dec 15, 2016, 15:26, ngphuoc wrote:
Sure, here they are:
julia> versioninfo()
Julia Version 0.5.1-pre+31
Commit 6a1e339 (2016-11-17 17:50 UTC)
Platform Info:
System: Linux (x86_64-linux-gnu)
CPU: Intel(R) Xeon(R) CPU E5-1660 v3 @ 3.00GHz
WORD_SIZE: 64
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.7.1 (ORCJIT, haswell)
julia> Pkg.status()
27 required packages:
- ArgParse 0.4.0
- AutoGrad 0.0.4
- CUDArt 0.2.3
- Convex 0.4.0
- DataFrames 0.8.5
- Distributions 0.11.1
- FileIO 0.2.0
- GR 0.18.0
- Glob 1.0.2
- HDF5 0.7.0
- ImageMagick 0.1.8
- JLD 0.6.6
- JSON 0.8.0
- Knet 0.8.1
- LegacyStrings 0.1.1
- LinearLeastSquares 0.1.0
- MAT 0.3.1
- MLBase 0.6.0
- MXNet 0.1.0
- Memcache 0.1.0
- Optim 0.7.1
- Plots 0.10.2
- PyPlot 2.2.4
- RDatasets 0.2.0
- SparseVectors 0.4.2
- StatsBase 0.11.1
- StatsFuns 0.3.1
49 additional packages:
- ArrayViews 0.6.4
- BinDeps 0.4.5
- Blosc 0.1.7
- BufferedStreams 0.2.0
- Calculus 0.1.15
- ColorTypes 0.2.12
- ColorVectorSpace 0.1.11
- Colors 0.6.9
- Compat 0.9.5
- Conda 0.4.0
- DataArrays 0.3.10
- DataStructures 0.4.6
- DiffBase 0.0.2
- FixedPointNumbers 0.2.1
- FixedSizeArrays 0.2.5
- Formatting 0.2.0
- ForwardDiff 0.3.3
- GZip 0.2.20
- Graphics 0.1.3
- Hiccup 0.0.3
- Images 0.5.14
- Iterators 0.2.0
- Juno 0.2.5
- LaTeXStrings 0.2.0
- Lazy 0.11.4
- Libz 0.2.0
- LineSearches 0.1.2
- MacroTools 0.3.2
- MathProgBase 0.5.8
- Measures 0.0.3
- Media 0.2.4
- NaNMath 0.2.2
- PDMats 0.5.2
- PlotThemes 0.1.0
- PlotUtils 0.2.0
- PositiveFactorizations 0.0.3
- PyCall 1.7.2
- RData 0.0.4
- RecipesBase 0.1.0
- Reexport 0.0.3
- Rmath 0.1.5
- SHA 0.3.0
- SIUnits 0.1.0
- Showoff 0.0.7
- SortingAlgorithms 0.1.0
- TexExtensions 0.0.3
- TextWrap 0.1.6
- URIParser 0.1.6
- Zlib 0.1.12
% nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2015 NVIDIA Corporation
Built on Tue_Aug_11_14:27:32_CDT_2015
Cuda compilation tools, release 7.5, V7.5.17
% cat /usr/local/cuda/include/cudnn.h | grep CUDNN_MAJOR -A 2
#define CUDNN_MAJOR 5
#define CUDNN_MINOR 1
#define CUDNN_PATCHLEVEL 3
--
#define CUDNN_VERSION (CUDNN_MAJOR * 1000 + CUDNN_MINOR * 100 + CUDNN_PATCHLEVEL)
#include "driver_types.h"
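From the header above, CUDNN_VERSION = 5*1000 + 1*100 + 3 = 5103, i.e., cuDNN 5.1.3, alongside CUDA 7.5 and Knet 0.8.1. To pick up the pushed fix and recheck, something like the following should work on Julia 0.5, assuming the fix landed on Knet's master branch:

julia> Pkg.checkout("Knet")    # switch from the 0.8.1 release to master
julia> Pkg.build("Knet")       # rebuild the CUDA kernels against the new source
% julia optimizers.jl          # rerun to confirm the warnings are gone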
I have checked. It works. Thanks a lot.