Hi,
I was building a network that processes sequences of sequences, i.e. input of shape `(batch_size, n_sequences, n_elements, element_size)`, with a `TimeDistributed` RNN layer. The following Gist shows a minimal example: https://gist.github.com/sjebbara/620d9ac4fc389a3f454444ec76c764f3
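For illustration, here is a minimal sketch (with hypothetical dimensions, not the values from the Gist) of the nested-sequence shape and of what `TimeDistributed` conceptually does with it:

```python
import numpy as np

# Hypothetical dimensions for the nested-sequence input
batch_size, n_sequences, n_elements, element_size = 2, 3, 4, 5
x = np.random.rand(batch_size, n_sequences, n_elements, element_size).astype("float32")

# TimeDistributed(GRU(...)) conceptually folds the first two axes,
# applies the wrapped RNN to each inner sequence of shape
# (n_elements, element_size), then unfolds the result again:
folded = x.reshape(batch_size * n_sequences, n_elements, element_size)
# ... the RNN consumes (samples, timesteps, features) here ...
print(folded.shape)  # (6, 4, 5)
```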
The odd thing is that the network compiles if `dropout_W` and `dropout_U` of the RNN (here a GRU) are set to 0. As soon as one of them is greater than 0, the model's `_make_predict_function()` throws an error with the message:
```
theano.gof.fg.MissingInputError:
An input of the graph, used to compute Subtensor{::, int64, ::}(<TensorType(float32, 3D)>, Constant{0}), was not provided and not given a value. Use the Theano flag exception_verbosity='high', for more information on this error.
```
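As the message suggests, more graph context can be obtained by raising Theano's exception verbosity. One way is to set the flag via the environment before Theano is imported (`THEANO_FLAGS` is read at import time); a minimal sketch:

```python
import os

# Must be set before theano is imported, since the flags are
# parsed once at import time.
os.environ["THEANO_FLAGS"] = "exception_verbosity=high"

# import theano  # theano would now print the full graph context on errors
print(os.environ["THEANO_FLAGS"])
```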
The error emerges here:
```
...
  File "/homes/sjebbara/git/keras-original/keras/engine/topology.py", line 514, in __call__
    self.add_inbound_node(inbound_layers, node_indices, tensor_indices)
  File "/homes/sjebbara/git/keras-original/keras/engine/topology.py", line 572, in add_inbound_node
    Node.create_node(self, inbound_layers, node_indices, tensor_indices)
  File "/homes/sjebbara/git/keras-original/keras/engine/topology.py", line 149, in create_node
    output_tensors = to_list(outbound_layer.call(input_tensors[0], mask=input_masks[0]))
  File "/homes/sjebbara/git/keras-original/keras/layers/wrappers.py", line 120, in call
    initial_states=[])
  File "/homes/sjebbara/git/keras-original/keras/backend/theano_backend.py", line 887, in rnn
    go_backwards=go_backwards)
```
Since the error only arises when dropout is applied, I suspect it originates in the `get_constants` method of the GRU layer.
Please make sure that the boxes below are checked before you submit your issue. Thank you!

- [x] Check that you are up-to-date with the master branch of Keras. You can update with:
  `pip install git+git://github.com/fchollet/keras.git --upgrade --no-deps`
- [x] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with:
  `pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps`
- [x] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).