zero.data.IndexLoader
class zero.data.IndexLoader(size, *args, device='cpu', **kwargs)
Like DataLoader, but over indices instead of data. The shuffling logic is delegated to the native PyTorch DataLoader, i.e. no custom logic is performed under the hood. The data loader which actually generates indices is available as IndexLoader.loader.
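For instance, the underlying loader can be inspected directly. A minimal sketch; the assumption that device controls the device of the yielded index tensors is inferred from the signature and is not stated on this page:

import torch
from torch.utils.data import DataLoader
from zero.data import IndexLoader

index_loader = IndexLoader(100, batch_size=16, shuffle=True)

# The PyTorch DataLoader that actually generates the index batches.
assert isinstance(index_loader.loader, DataLoader)

# Assumption: `device` places the yielded index tensors on that device.
if torch.cuda.is_available():
    cuda_loader = IndexLoader(100, batch_size=16, device='cuda')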
Examples
Usage for training:
train_loader = IndexLoader(len(train_dataset), batch_size, shuffle=True)
for epoch in epochs:
    for batch_idx in train_loader:
        ...
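Each batch_idx is an index tensor, so for datasets held in memory as tensors it can slice the features and labels directly. A minimal sketch with placeholder data, model and optimizer (none of which come from this page):

import torch
from torch import nn
from zero.data import IndexLoader

# Dummy in-memory dataset and model; only the IndexLoader usage pattern matters here.
X_train = torch.randn(1000, 8)
y_train = torch.randn(1000, 1)
model = nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

train_loader = IndexLoader(len(X_train), batch_size=64, shuffle=True)
for epoch in range(3):
    for batch_idx in train_loader:
        # batch_idx is an index tensor, so it can slice in-memory tensors directly.
        x_batch, y_batch = X_train[batch_idx], y_train[batch_idx]
        optimizer.zero_grad()
        loss = loss_fn(model(x_batch), y_batch)
        loss.backward()
        optimizer.step()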
Other examples:
dataset_size = 10  # len(dataset)
for batch_idx in IndexLoader(dataset_size, batch_size=3):
    print(batch_idx)
tensor([0, 1, 2])
tensor([3, 4, 5])
tensor([6, 7, 8])
tensor([9])
dataset_size = 10  # len(dataset)
for batch_idx in IndexLoader(dataset_size, 3, drop_last=True):
    print(batch_idx)
tensor([0, 1, 2])
tensor([3, 4, 5])
tensor([6, 7, 8])
Attributes

loader: The underlying DataLoader.

Methods

__init__(size, *args, device='cpu', **kwargs): Initialize self.
__len__(): Get the size of the underlying DataLoader.
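A short sketch of the attribute and method listed above, assuming the described behavior that len() reports the size of the underlying DataLoader, i.e. the number of index batches:

from zero.data import IndexLoader

index_loader = IndexLoader(10, batch_size=3)

# 10 indices with batch_size=3 and no drop_last give ceil(10 / 3) == 4 batches.
assert len(index_loader) == len(index_loader.loader) == 4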