Multi-input RNN and TimeDistributed wrapper #3432
Conversation
Also see #3057
ping @fchollet
x = tf.transpose(x, (axes))
input_list += [tf.unpack(x)]
assert len(set(map(len, input_list))) == 1, "All input sequences should be of equal length."
input_list = [map(lambda x: x[t], input_list) for t in range(len(input_list[0]))]
I feel like this should be handled more elegantly in the above loop. This line is not very readable.
ndim = len(x.get_shape())
assert ndim >= 3, "Input should be at least 3D."
for t in range(input_length):
    input_list[t] += [x[(slice(None), t) + (slice(None),) * (ndim - 2)]]
Indentation issue. Are you using a linter (like pylint)?
Can you describe the motivation for this feature? Would it be used anywhere?
I currently use it in a couple of custom layers. The RNN scans through multiple sequences, and at each time step, an element from only one of the sequences is chosen, based on a hidden state.
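For illustration only (this sketch is not from the PR; the names, shapes, and the hard argmax selection are assumptions), one way such a per-time-step selection could look in plain NumPy:

import numpy as np

def select_step(x_slices, h, W_select):
    # x_slices: one slice per input sequence, each of shape (batch, dim)
    # h: current hidden state, shape (batch, hidden_dim)
    # W_select: hypothetical projection that scores the sequences, shape (hidden_dim, n_sequences)
    scores = h.dot(W_select)                        # (batch, n_sequences)
    choice = scores.argmax(axis=-1)                 # index of the chosen sequence, per sample
    stacked = np.stack(x_slices, axis=1)            # (batch, n_sequences, dim)
    return stacked[np.arange(len(choice)), choice]  # chosen element, shape (batch, dim)

# Example usage with made-up sizes:
x_slices = [np.random.rand(4, 8) for _ in range(3)]
h = np.random.rand(4, 16)
W = np.random.rand(16, 3)
chosen = select_step(x_slices, h, W)  # shape (4, 8)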
inputs = tf.transpose(inputs, (axes))
input_list = tf.unpack(inputs)
if type(inputs) == list:
    input_list = [[]] * input_length
>>> input_list = [[]] * 10
>>> input_list[0] += [1]
>>> input_list
[[1], [1], [1], [1], [1], [1], [1], [1], [1], [1]]
Probably not what you intended.
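For what it's worth, the usual fix is a list comprehension, which builds independent inner lists instead of ten references to the same list:

>>> input_list = [[] for _ in range(10)]
>>> input_list[0] += [1]
>>> input_list
[[1], [], [], [], [], [], [], [], [], []]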
Here's another use case for this feature: we are building reading comprehension models and memory networks using Keras. To answer a question, we perform some multi-step operations on the input question, answer candidate, and a passage of text. If the question is multiple choice, it would be really convenient to be able to just TimeDistribute those operations across the answer candidates.
@matt-gardner Unfortunately, this PR is way too old for me to get back to it. You can add your unpacking logic as a Lambda layer to a model, and then add your layer to the model. Since models are layers by inheritance, you can timedistribute the model.
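A rough sketch of that workaround (the layer sizes, names, and Lambda body below are placeholders, not from this thread): build an inner model containing the custom logic, then wrap it in TimeDistributed.

from keras.layers import Input, Dense, Lambda, TimeDistributed
from keras.models import Model

feature_dim = 50

# Inner model: handles a single time step / answer candidate.
inner_in = Input(shape=(feature_dim,))
unpacked = Lambda(lambda x: x * 2.0)(inner_in)  # stand-in for the custom unpacking logic
inner_out = Dense(10)(unpacked)
inner_model = Model(inner_in, inner_out)

# Because a Model is itself a Layer, it can be wrapped in TimeDistributed
# and applied to every element along the time (or candidate) axis.
outer_in = Input(shape=(None, feature_dim))
outer_out = TimeDistributed(inner_model)(outer_in)
outer_model = Model(outer_in, outer_out)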
Theano's scan op allows looping through multiple sequences simultaneously, but this feature is currently hidden by Keras's abstraction, forcing users to concatenate sequences (and then unpack them in the step function). With this change, the inputs argument can be a tensor or a list of tensors with the same input length.
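For reference, a minimal Theano snippet (variable names assumed; not part of this PR) showing scan iterating over two sequences in lockstep, which is the capability this PR tries to surface through the Keras RNN interface:

import theano
import theano.tensor as T

seq_a = T.matrix('seq_a')  # shape (timesteps, dim)
seq_b = T.matrix('seq_b')  # shape (timesteps, dim)

def step(a_t, b_t, acc):
    # a_t and b_t are the slices of the two sequences at the same time step
    return acc + a_t * b_t

outputs, _ = theano.scan(step,
                         sequences=[seq_a, seq_b],
                         outputs_info=T.zeros_like(seq_a[0]))
f = theano.function([seq_a, seq_b], outputs[-1])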