## 🐛 Bug description
The following code will break if `batch_size` is smaller than the world size:
```python
import torch

import ignite.distributed as idist


def foo(i):
    data = torch.arange(100).reshape(25, 4)
    data_loader = idist.auto_dataloader(data, batch_size=6, num_workers=12)


if __name__ == "__main__":
    idist.spawn("gloo", foo, args=(), nproc_per_node=8)
```
```
ValueError: batch_size should be a positive integer value, but got batch_size=0
```
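Presumably the helper scales the batch size by integer division over the world size, which truncates to zero in this case (this is an assumption about the internals for illustration, not a verified excerpt from `auto_dataloader`):

```python
# Assumed scaling behavior: batch size is divided across processes.
batch_size = 6
world_size = 8  # nproc_per_node=8

per_process = batch_size // world_size
print(per_process)  # 0 -> rejected by DataLoader as a non-positive batch size
```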
A fix could be to:
- keep the batch size as provided if it is smaller than the world size
- do the same for `num_workers`
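A minimal sketch of that rule, assuming division-based scaling (the helper name `scale_for_distributed` is hypothetical, not an Ignite API):

```python
def scale_for_distributed(batch_size: int, num_workers: int, world_size: int):
    # Divide across processes only when the value is at least the world
    # size; otherwise keep it as provided, so it never truncates to zero.
    scaled_batch = batch_size // world_size if batch_size >= world_size else batch_size
    scaled_workers = num_workers // world_size if num_workers >= world_size else num_workers
    return scaled_batch, scaled_workers


# The failing case from the report: batch_size=6, num_workers=12, 8 processes.
print(scale_for_distributed(6, 12, 8))   # (6, 1) -- batch size kept as provided
print(scale_for_distributed(64, 12, 8))  # (8, 1) -- both scaled down normally
```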
## Environment
- PyTorch Version (e.g., 1.4): 1.5.0
- Ignite Version (e.g., 0.3.0): master
- OS (e.g., Linux): Linux
- How you installed Ignite (`conda`, `pip`, source):
- Python version: 3.7
- Any other relevant information:
@InCogNiTo124 would you like to fix this, since you recently played around with the `auto_*` helpers?