zero.data.iloader

zero.data.iloader(size, *args, **kwargs)

Make a DataLoader over batches of indices.

The shuffling logic is fully delegated to the native PyTorch DataLoader, i.e. no custom logic is performed under the hood.
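
For intuition, here is a minimal sketch of an equivalent construction (an assumption for illustration, not the library's actual implementation): torch.arange(size) supports __len__ and __getitem__, so the native DataLoader can batch over it directly.

import torch
from torch.utils.data import DataLoader

def iloader_sketch(size, *args, **kwargs):
    # Hypothetical equivalent of iloader: batch over the index range with a
    # native DataLoader; all batching/shuffling options are forwarded untouched.
    assert size > 0, 'size must be positive'
    return DataLoader(torch.arange(size), *args, **kwargs)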

Parameters

size (int) – the number of items to iterate over; the yielded indices are 0, 1, ..., size - 1

args – positional arguments for torch.utils.data.DataLoader

kwargs – keyword arguments for torch.utils.data.DataLoader

Raises

AssertionError – if size is not positive

Return type

torch.utils.data.dataloader.DataLoader

See also

iter_batches

Examples

Usage for training:

from zero.data import iloader

# train_dataset, batch_size and n_epochs are assumed to be defined elsewhere
train_loader = iloader(len(train_dataset), batch_size, shuffle=True)
for epoch in range(n_epochs):
    for batch_idx in train_loader:
        ...
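
For a self-contained picture, the sketch below uses the batches of indices to slice in-memory tensors during training (the model, data and hyperparameters are placeholders invented for illustration, not part of the library):

import torch
from zero.data import iloader

X = torch.randn(100, 4)  # in-memory features
y = torch.randn(100, 1)  # in-memory targets
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()

train_loader = iloader(len(X), batch_size=8, shuffle=True)
for epoch in range(2):
    for batch_idx in train_loader:
        # The batch of indices selects rows from the in-memory tensors.
        loss = loss_fn(model(X[batch_idx]), y[batch_idx])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()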

Other examples:

>>> dataset_size = 10  # len(dataset)
>>> for batch_idx in iloader(dataset_size, batch_size=3):
...     print(batch_idx)
tensor([0, 1, 2])
tensor([3, 4, 5])
tensor([6, 7, 8])
tensor([9])

>>> dataset_size = 10  # len(dataset)
>>> for batch_idx in iloader(dataset_size, 3, drop_last=True):
...     print(batch_idx)
tensor([0, 1, 2])
tensor([3, 4, 5])
tensor([6, 7, 8])
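
Since all options are forwarded to the native DataLoader, shuffling can be made reproducible with a seeded torch.Generator. A hedged illustration (the concrete shuffled order depends on the PyTorch version, so no output is shown):

import torch
from zero.data import iloader

g = torch.Generator().manual_seed(0)
for batch_idx in iloader(10, 3, shuffle=True, generator=g):
    # prints four batches covering the indices 0..9 in a seeded random order
    print(batch_idx)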