This repository has been archived by the owner on Jan 3, 2023. It is now read-only.

Support differing batch sizes at train and test time #6

Closed
scttl opened this issue May 1, 2015 · 1 comment

scttl commented May 1, 2015

Due to the pre-allocation of intermediate matrices (such as pre-activations), the batch size used at training time is automatically expected at test time as well. This creates problems when the two need to differ (e.g. full-batch inference, where the batch size equals the total number of records).
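A minimal sketch of the failure mode (hypothetical layer and names, not neon's actual code): the pre-activation buffer is sized from the training batch size at construction, so a forward pass with a different batch size cannot write into it.

```python
import numpy as np

class DenseLayer:
    """Toy layer that pre-allocates its pre-activation buffer (hypothetical)."""
    def __init__(self, n_in, n_out, batch_size):
        self.W = np.zeros((n_in, n_out))
        # Buffer shape is fixed at construction time from the training batch size.
        self.pre_act = np.empty((batch_size, n_out))

    def fprop(self, x):
        # np.dot with out= requires the output shape to match exactly,
        # so inputs with a different batch size raise a ValueError.
        np.dot(x, self.W, out=self.pre_act)
        return self.pre_act

layer = DenseLayer(n_in=4, n_out=3, batch_size=128)
layer.fprop(np.ones((128, 4)))       # works: matches training batch size
try:
    layer.fprop(np.ones((500, 4)))   # full-dataset "batch" at test time
except ValueError as e:
    print("shape mismatch:", e)
```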

@scttl scttl added the bug label May 1, 2015

scttl commented Jul 6, 2015

As batch_size is no longer stored as part of the model parameters, we can now train a model with one batch size, save it, load it, and then use it to generate predictions with a different batch_size.
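The fix can be sketched as follows (a hypothetical illustration, not neon's actual implementation): buffers are sized lazily from the incoming data rather than at construction, and serialization covers only the weights, so no batch size is baked into a saved model.

```python
import numpy as np
import pickle

class DenseLayer:
    """Toy layer whose buffers are sized lazily from the data (hypothetical)."""
    def __init__(self, n_in, n_out):
        self.W = np.zeros((n_in, n_out))
        self.pre_act = None  # allocated on first fprop, re-allocated on shape change

    def fprop(self, x):
        if self.pre_act is None or self.pre_act.shape[0] != x.shape[0]:
            self.pre_act = np.empty((x.shape[0], self.W.shape[1]))
        np.dot(x, self.W, out=self.pre_act)
        return self.pre_act

    def save_params(self):
        # Only the weights are serialized -- no batch size in the saved state.
        return pickle.dumps({"W": self.W})

    @classmethod
    def load_params(cls, blob):
        params = pickle.loads(blob)
        layer = cls(*params["W"].shape)
        layer.W = params["W"]
        return layer

# Train with batch size 128, then predict with batch size 500:
trained = DenseLayer(4, 3)
trained.fprop(np.ones((128, 4)))
restored = DenseLayer.load_params(trained.save_params())
print(restored.fprop(np.ones((500, 4))).shape)  # (500, 3)
```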

@scttl scttl closed this as completed Jul 6, 2015