
batch_size for MNISTDataModule #171

Closed
dedeswim opened this issue Aug 26, 2020 · 4 comments · Fixed by #331
Labels: enhancement, help wanted
Milestone: v0.3

Comments

@dedeswim

🚀 Feature

Add the batch_size parameter to the MNISTDataModule and BinaryMNISTDataModule.

Motivation

When using the MNISTDataModule directly within PyTorch Lightning (i.e. as an argument to Trainer.fit or as the _datamodule field inside a LightningModule), there is no way to set the batch size.

Pitch

I would like to be able to set the batch size when initializing an MNISTDataModule as I can with many other DataModules right now (like CIFAR10DataModule or ImagenetDataModule).
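For illustration, a minimal sketch of the desired call pattern; the exact argument names for MNISTDataModule are an assumption until this is implemented, while CIFAR10DataModule already works this way:

```python
from pl_bolts.datamodules import CIFAR10DataModule, MNISTDataModule

# Already possible today: the batch size is fixed at construction time.
cifar_dm = CIFAR10DataModule(data_dir=".", batch_size=64)

# Requested: the same pattern for MNIST (hypothetical until this issue is fixed).
mnist_dm = MNISTDataModule(data_dir=".", batch_size=64)

# The DataModule could then be handed straight to the trainer:
# trainer.fit(model, datamodule=mnist_dm)
```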

Alternatives

An alternative way to set the batch size would be not to feed the DataModule to the trainer directly (or use it as the _datamodule field), but to call its train_dataloader, val_dataloader, and test_dataloader methods individually (sketched below). However, I think this would defeat one of the main points of using a DataModule.
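A sketch of that workaround, assuming the dataloader methods accept batch_size as a parameter (which is how the module currently behaves):

```python
from pl_bolts.datamodules import MNISTDataModule

dm = MNISTDataModule(data_dir=".")
dm.prepare_data()  # download MNIST once

# Build each loader by hand instead of passing the DataModule itself;
# batch_size here is a method argument, not a property of the module.
train_loader = dm.train_dataloader(batch_size=64)
val_loader = dm.val_dataloader(batch_size=64)
test_loader = dm.test_dataloader(batch_size=64)

# trainer.fit(model, train_loader, val_loader)  # the DataModule convenience is lost
```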

Additional context

A possible implementation could follow the pattern shown in Lightning's docs. I would be happy to open a PR and work on this.
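For reference, a minimal sketch along the lines of the DataModule example in Lightning's docs; the split sizes and default values are assumptions, not the final API:

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader, random_split
from torchvision import transforms
from torchvision.datasets import MNIST


class MNISTDataModule(pl.LightningDataModule):
    def __init__(self, data_dir: str = "./", batch_size: int = 32, num_workers: int = 4):
        super().__init__()
        self.data_dir = data_dir
        self.batch_size = batch_size  # stored once, reused by every dataloader
        self.num_workers = num_workers

    def prepare_data(self):
        # Download only; runs on a single process.
        MNIST(self.data_dir, train=True, download=True)
        MNIST(self.data_dir, train=False, download=True)

    def setup(self, stage=None):
        transform = transforms.ToTensor()
        mnist_full = MNIST(self.data_dir, train=True, transform=transform)
        self.mnist_train, self.mnist_val = random_split(mnist_full, [55000, 5000])
        self.mnist_test = MNIST(self.data_dir, train=False, transform=transform)

    def train_dataloader(self):
        return DataLoader(self.mnist_train, batch_size=self.batch_size,
                          num_workers=self.num_workers, shuffle=True)

    def val_dataloader(self):
        return DataLoader(self.mnist_val, batch_size=self.batch_size,
                          num_workers=self.num_workers)

    def test_dataloader(self):
        return DataLoader(self.mnist_test, batch_size=self.batch_size,
                          num_workers=self.num_workers)
```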

@dedeswim added the enhancement and help wanted labels on Aug 26, 2020
@github-actions

Hi! Thanks for your contribution, great first issue!

@nateraw
Contributor

nateraw commented Sep 11, 2020

Done 😄 ... you got num_workers for free too!

@nateraw closed this as completed on Sep 11, 2020
@hecoding
Contributor

Hi, it's not actually working; batch_size is still taken from the method parameter rather than the constructor.
I could make a small PR if anyone is interested.

@hecoding
Contributor

hecoding commented Nov 3, 2020

Well, I was the one interested :)
The changes are in the PR above ^

I need some reviewers. Should I create a new issue or something? (I don't have much experience with contributions.)

@Borda added this to the v0.3 milestone on Jan 18, 2021