## 🚀 Feature

Add a `batch_size` parameter to `MNISTDataModule` and `BinaryMNISTDataModule`.
## Motivation
When using the `MNISTDataModule`, there is no way to set the batch size if it is used directly within PyTorch Lightning (i.e. passed as an argument to `Trainer.fit` or set as the `_datamodule` field inside a `LightningModule`).
## Pitch
I would like to be able to set the batch size when initializing an `MNISTDataModule`, as I already can with many other DataModules (such as `CIFAR10DataModule` or `ImagenetDataModule`).
## Alternatives
An alternative way to set the batch size would be not to feed the `DataModule` to the trainer directly (or use it as the `_datamodule` field), but to call its `train_dataloader`, `val_dataloader`, and `test_dataloader` methods separately. However, I think this would defeat one of the main points of using a `DataModule`.
## Additional context
A possible implementation could be similar to the one shown in Lightning's docs. I would be happy to open a PR and work on this.
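As a rough illustration, here is a minimal sketch of how `batch_size` could be plumbed from the constructor into every dataloader. To keep the sketch self-contained, `LightningDataModule`, `MNIST`, and `DataLoader` are replaced with trivial stand-ins; the real implementation would use the actual `pytorch_lightning` and `torchvision` classes, and the constructor signature is the part that matters here.

```python
# Minimal stand-ins so the sketch runs without pytorch_lightning / torch
# installed; they only mimic the attributes used below.
class LightningDataModule:  # stand-in for pytorch_lightning.LightningDataModule
    pass

class MNIST:  # stand-in for torchvision.datasets.MNIST
    def __init__(self, root, train=True):
        self.data = list(range(10))  # dummy samples

class DataLoader:  # stand-in for torch.utils.data.DataLoader
    def __init__(self, dataset, batch_size=1, shuffle=False):
        self.dataset = dataset
        self.batch_size = batch_size

class MNISTDataModule(LightningDataModule):
    def __init__(self, data_dir: str = "./data", batch_size: int = 32):
        super().__init__()
        self.data_dir = data_dir
        # Store the batch size once, so every dataloader picks it up.
        self.batch_size = batch_size

    def setup(self, stage=None):
        self.mnist_train = MNIST(self.data_dir, train=True)
        self.mnist_test = MNIST(self.data_dir, train=False)

    def train_dataloader(self):
        return DataLoader(self.mnist_train, batch_size=self.batch_size, shuffle=True)

    def test_dataloader(self):
        return DataLoader(self.mnist_test, batch_size=self.batch_size)

dm = MNISTDataModule(batch_size=64)
dm.setup()
print(dm.train_dataloader().batch_size)  # → 64
```

With this in place, `MNISTDataModule(batch_size=64)` could be passed straight to `Trainer.fit`, matching how `CIFAR10DataModule` already behaves.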