TF 0.9: "This module is deprecated. Use tf.nn.rnn_* instead." #41
But I got this warning: WARNING:tensorflow:<tensorflow.python.ops.rnn_cell.BasicLSTMCell object at 0x101e360b8>:
Using a concatenated state is slower and will soon be deprecated. Use state_is_tuple=True. I need to inspect this further. Please let me know if you have any ideas.
I'm not positive, but I think I have an idea of what's going on. For the initial state of the network (let's say we're using a GRU), TensorFlow can pass in one matrix of shape (batch_size x hidden_dim) representing the z state and another matrix of shape (batch_size x hidden_dim) representing the r state. Alternatively, it could pass in a single matrix of shape (batch_size x 2 * hidden_dim), perform the matrix multiplication once, and slice out the r and z components from the result. Usually the latter is faster, but I guess TensorFlow must have some new optimization that makes two separate state matrices faster than a single concatenated one. Do you get an error when you set state_is_tuple=True? If not, this might be the correct way to do RNNs in TF from now on.
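To make the two layouts concrete, here is a minimal plain-NumPy sketch of the slicing idea described above. This is only an illustration, not TensorFlow's actual implementation; all names (`h_dim`, `W_r`, `W_z`, etc.) are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, h_dim = 4, 8

h = rng.standard_normal((batch, h_dim))      # previous hidden state
W = rng.standard_normal((h_dim, 2 * h_dim))  # weights for r and z gates, concatenated

# Layout 1: one matmul against the concatenated weight matrix, then slice.
gates = h @ W                                # shape (batch, 2 * h_dim)
r_concat, z_concat = gates[:, :h_dim], gates[:, h_dim:]

# Layout 2: two separate matmuls with the sliced weight matrices
# (analogous to keeping the states as a tuple of two matrices).
W_r, W_z = W[:, :h_dim], W[:, h_dim:]
r_split, z_split = h @ W_r, h @ W_z

# Both layouts produce identical gate pre-activations; the difference
# is purely how many matmul kernels get launched.
assert np.allclose(r_concat, r_split)
assert np.allclose(z_concat, z_split)
```

Mathematically the two are equivalent, so the state_is_tuple change is an implementation/performance decision rather than a modeling one.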
Thanks! I think I'll have time to look into this problem next week.
Was a solution found?
I think #46 fixed it. Please let us know if it does not work. Thanks!
I think we need to update the LSTM code: