
[Feature Request] option to make batch size fixed #399

Open
khaotik opened this issue Aug 6, 2017 · 1 comment

Comments

@khaotik
khaotik commented Aug 6, 2017

I see this in the SequentialScheme docstring:

 |  Notes
 |  -----
 |  The batch size isn't enforced, so the last batch could be smaller.

In libraries such as TensorFlow, where tensor shapes are static by nature, I find this behavior causes headaches.
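
For concreteness, here is a minimal sketch of the behavior I mean (assuming the usual `SequentialScheme(examples, batch_size)` constructor and that requests are lists of indices):

```python
from fuel.schemes import SequentialScheme

# 10 examples with batch_size=4: the final request holds only 2 indices,
# which breaks any graph built with a fixed batch dimension of 4.
scheme = SequentialScheme(10, batch_size=4)
print(list(scheme.get_request_iterator()))
# expected: [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```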

@dmitriy-serdyuk
Contributor

Agreed. There should be an option to drop the last batch.
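
In the meantime, a rough sketch of a workaround (not an official Fuel API; the class name `FixedBatchSizeScheme` is hypothetical, and it assumes the scheme stores `self.batch_size` and yields index lists, as in current Fuel):

```python
from fuel.schemes import SequentialScheme


class FixedBatchSizeScheme(SequentialScheme):
    """Hypothetical scheme that silently drops a trailing short batch."""

    def get_request_iterator(self):
        for request in super(FixedBatchSizeScheme, self).get_request_iterator():
            # Yield only requests that contain a full batch of indices.
            if len(request) == self.batch_size:
                yield request
```

A proper fix would probably be a flag on the scheme itself rather than a separate subclass, but the sketch shows the intent.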
