Different batch_size in training and testing for Bi-directional RNN #14
Yes, you can change the batch size at any time, during training or testing. For simplicity, you can use the master version of TensorFlow, where you don't need to provide seq_length. So you can directly use:
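The snippet that originally followed is not preserved in the thread; against the then-master (pre-1.0) TensorFlow API it would have looked roughly like this. This is a sketch only — the cell and tensor names are assumptions, and the `tf.models.rnn` module has long since been removed:

```python
# Sketch against the pre-1.0 TensorFlow API (tf.models.rnn);
# n_hidden and inputs are assumed to be defined elsewhere.
from tensorflow.models.rnn import rnn, rnn_cell

lstm_fw_cell = rnn_cell.BasicLSTMCell(n_hidden)
lstm_bw_cell = rnn_cell.BasicLSTMCell(n_hidden)

# On master, sequence_length is optional, so nothing in the graph
# definition depends on the batch size.
outputs = rnn.bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, inputs)
```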
This function is then independent of batch_size. If you are using the 0.6.0 version and want to change the batch size, you need to change the function a little and provide batch_size as a placeholder:
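The 0.6.0 variant being described might look like the following — again a sketch against the old API, where `x`, `n_steps`, `inputs`, the cells, and the session variables are assumptions rather than code from this thread:

```python
# TensorFlow 0.6.0 sketch: feed batch_size at run time so the
# required sequence_length tensor is not baked in as a constant.
batch_size = tf.placeholder(tf.int32)
_seq_len = tf.fill([batch_size], tf.constant(n_steps, dtype=tf.int64))
outputs = rnn.bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, inputs,
                                sequence_length=_seq_len)

# Training and testing can then feed different batch sizes:
sess.run(optimizer, feed_dict={x: train_x, batch_size: 128})
sess.run(pred, feed_dict={x: test_x, batch_size: len(test_x)})
```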
Thanks Aymeric. Your suggestions really helped me. I was trying to make batch_size a placeholder, but I could not figure out how it works. Combined with the early-stopping technique for handling variable-length input, I think the bi-directional RNN module is now complete. Thanks for the efforts.
Glad to hear :) I close the issue then.
Hi,
I am working on bi-directional RNNs, so it is great to find your demo code here. However, I noticed that in your bi-directional RNN examples, the batch size for training and testing has to remain the same. This is because the new TensorFlow 0.6.0 requires setting seq_len with a constant value that is tied to batch_size.
I totally understand that when we unroll the network, the batch_size has to be fixed along with the sequence length. But at test time, can we set the batch size to a different value? How would we implement that?
Is it possible to re-initialize the RNN with a different configuration, say a different batch_size?
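For intuition on why this should be possible at all: the trainable weights of a bidirectional RNN never reference the batch size, so the same parameters can process any number of sequences. A plain-numpy sketch (all names here are illustrative, not from the demo code) makes this concrete:

```python
import numpy as np

def birnn_forward(x, Wf, Uf, Wb, Ub):
    """Bidirectional RNN forward pass; the weight shapes never
    mention the batch dimension, only input/hidden sizes."""
    batch, steps, _ = x.shape
    hidden = Uf.shape[0]
    h_f = np.zeros((batch, hidden))
    h_b = np.zeros((batch, hidden))
    fwd, bwd = [], []
    for t in range(steps):                # forward direction
        h_f = np.tanh(x[:, t] @ Wf + h_f @ Uf)
        fwd.append(h_f)
    for t in reversed(range(steps)):      # backward direction
        h_b = np.tanh(x[:, t] @ Wb + h_b @ Ub)
        bwd.append(h_b)
    bwd.reverse()
    # Concatenate forward/backward states: (batch, steps, 2 * hidden).
    return np.concatenate([np.stack(fwd, 1), np.stack(bwd, 1)], axis=-1)

rng = np.random.default_rng(0)
n_in, n_hidden, steps = 3, 5, 4
Wf = rng.normal(size=(n_in, n_hidden))
Wb = rng.normal(size=(n_in, n_hidden))
Uf = rng.normal(size=(n_hidden, n_hidden))
Ub = rng.normal(size=(n_hidden, n_hidden))

# The same weights serve a "training" batch of 32 and a "test" batch of 1.
train_out = birnn_forward(rng.normal(size=(32, steps, n_in)), Wf, Uf, Wb, Ub)
test_out = birnn_forward(rng.normal(size=(1, steps, n_in)), Wf, Uf, Wb, Ub)
print(train_out.shape, test_out.shape)  # (32, 4, 10) (1, 4, 10)
```

The restriction in the 0.6.0 example is therefore not fundamental; it comes only from the graph being defined with a constant batch size, which the placeholder approach above removes.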
Thanks very much.