tensorflow error: you must feed a value for placeholder tensor 'placeholder_8' with dtype float #33

Closed
ExpandingMan opened this issue May 18, 2017 · 6 comments

@ExpandingMan
Contributor

ExpandingMan commented May 18, 2017

I'm probably getting a bit ahead of things here, since I'm using TensorFlow.jl master: I'm on Julia 0.6 and the latest release doesn't work there. TensorFlow.jl itself seems to work just fine on master.

Anyway, I get the following error when attempting to train the MNIST example:

ERROR: LoadError: Tensorflow error: Status: You must feed a value for placeholder tensor 'placeholder_8' with dtype float
         [[Node: placeholder_8 = Placeholder[_class=[], dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/gpu:0"]()]]
         [[Node: placeholder_8/_17 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/cpu:0", send_device="/job:localhost/replica:0/task:0/gpu:0", send_device_incarnation=1, tensor_name="edge_3_placeholder_8", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/cpu:0"]()]]

Stacktrace:
 [1] check_status at /home/expandingman/.julia/v0.6/TensorFlow/src/core.jl:402 [inlined]
 [2] run(::TensorFlow.Session, ::Array{TensorFlow.Port,1}, ::Array{Any,1}, ::Array{TensorFlow.Port,1}, ::Array{Ptr{Void},1}) at /home/expandingman/.julia/v0.6/TensorFlow/src/run.jl:100
 [3] run(::TensorFlow.Session, ::Array{TensorFlow.Tensor{Float32},1}, ::Dict{Any,Any}) at /home/expandingman/.julia/v0.6/TensorFlow/src/run.jl:169
 [4] run(::TensorFlow.Session, ::TensorFlow.Tensor{Float32}, ::Dict{Any,Any}) at /home/expandingman/.julia/v0.6/TensorFlow/src/run.jl:187
 [5] back!(::Flux.TF.Exec, ::Array{Float32,2}, ::Array{Float64,2}) at /home/expandingman/.julia/v0.6/Flux/src/backend/tensorflow/model.jl:44
 [6] macro expansion at /home/expandingman/.julia/v0.6/Flux/src/training.jl:36 [inlined]
 [7] macro expansion at /home/expandingman/.julia/v0.6/Juno/src/progress.jl:128 [inlined]
 [8] macro expansion at /home/expandingman/.julia/v0.6/Flux/src/training.jl:15 [inlined]
 [9] macro expansion at /home/expandingman/.julia/v0.6/Juno/src/progress.jl:128 [inlined]
 [10] #train!#119(::Array{##3#4,1}, ::Int64, ::Float64, ::Function, ::Function, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at /home/expandingman/.julia/v0.6/Flux/src/training.jl:29
 [11] (::Flux.#kw##train!)(::Array{Any,1}, ::Flux.#train!, ::Flux.TF.Model, ::Array{Tuple{Array{Float64,1},Array{Int64,1}},1}) at ./<missing>:0
 [12] include_from_node1(::String) at ./loading.jl:552
 [13] include(::String) at ./sysimg.jl:14
while loading /home/expandingman/src/test_flux.jl, in expression starting on line 22

Since I suspect this is mostly the fault of my insisting on TensorFlow.jl master (a consequence of being on 0.6), I'll probably look into patching this myself until the 0.6 ecosystem matures.

Update: It seems that placeholder_8 is a placeholder that gets added to the TensorFlow graph object but is disconnected from the rest of the graph (i.e. not used by any ops). Again, this is in the basic MNIST example. Apparently it is the placeholder for the gradients.
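
To illustrate the general failure mode, here's a minimal standalone sketch using TensorFlow.jl's public `placeholder`/`run` API directly (a hypothetical toy example, not the graph Flux actually builds):

```julia
using TensorFlow

sess = Session(Graph())
x = placeholder(Float32)   # placeholder we do feed
g = placeholder(Float32)   # stands in for the stray gradient placeholder
y = x + constant(1f0)

# Works: g is not part of the fetched subgraph, so it needs no feed.
run(sess, y, Dict(x => Float32[1.0]))

# Fails with "You must feed a value for placeholder tensor ... with dtype float",
# because g is now part of what is being fetched but is never fed.
run(sess, [y, g + constant(1f0)], Dict(x => Float32[1.0]))
```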

@MikeInnes
Member

Yes, this is a valid issue with Flux. We changed our approach to gradients to make things more consistent, but were waiting on malmaud/TensorFlow.jl#215 to complete it. It should be fixed soon.

@ExpandingMan
Contributor Author

Thanks for the update. Indeed, it looks like TensorFlow.jl has already merged that pull request. In that case I'll wait rather than trying to get this working through some hack.

By the way, the warnings about the dot operators seem very hard to fix. Will this require some way of passing multiple function arguments to `graph`, as in `graph(::typeof(broadcast), ::typeof(+), args...)`, or will there be some simpler solution?

@MikeInnes
Member

That's one option. Right now I'm trying to implement #31 in DataFlow.jl, which is a slightly different design. With that you'd dispatch on `graph(::Broadcast{typeof(+)}, args...)`.
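
For the sake of illustration, a rough sketch of what that design could look like (`Broadcast` here is the proposed wrapper type, not an existing DataFlow.jl API, and `graph` stands for Flux's backend lowering function):

```julia
# Hypothetical wrapper type recording the function being broadcast,
# so backends can dispatch on it.
struct Broadcast{F}
    f::F
end

# A backend then adds one method per broadcasted primitive,
# e.g. lowering a broadcast of + to the backend's elementwise add.
graph(::Broadcast{typeof(+)}, x, y) = x + y
```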

@MikeInnes
Member

So now DataFlow just interprets `f.(x)` as `broadcast(f, x)`. This is pretty easy to handle in Flux; the right `graph` method already gets called, so we just need to add the overloads.
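
Roughly, the overloads in question would look something like this (illustrative only, not Flux's actual backend source; it assumes TensorFlow.jl's `Tensor` operator overloads):

```julia
# Once f.(x) lowers to broadcast(f, x), the TensorFlow backend just needs
# overloads that map a broadcast of a primitive onto the corresponding TF op.
graph(::typeof(broadcast), ::typeof(+), a, b) = a + b    # elementwise add node
graph(::typeof(broadcast), ::typeof(*), a, b) = a .* b   # elementwise multiply node
```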

@baggepinnen
Contributor

baggepinnen commented Jun 16, 2017

I am still having this issue. Has it perhaps been resolved on some branch other than the latest tagged version?

@MikeInnes
Member

This should be fixed in the latest release.
