
Queue is not respecting the batch size #1173

Closed
1 task done
icekang opened this issue May 28, 2024 · 1 comment
Labels
bug Something isn't working

Comments


icekang commented May 28, 2024

Is there an existing issue for this?

  • I have searched the existing issues

Bug summary

In the example from the website, I printed the shape of patches_batch, and its batch size is not the expected / specified value passed to the data loader.

Code for reproduction

# from https://torchio.readthedocs.io/patches/patch_training.html#queue
import torch
import torchio as tio
from torch.utils.data import DataLoader
patch_size = 96
queue_length = 300
samples_per_volume = 10
sampler = tio.data.UniformSampler(patch_size)
subject = tio.datasets.Colin27()
subjects_dataset = tio.SubjectsDataset(10 * [subject])
patches_queue = tio.Queue(
    subjects_dataset,
    queue_length,
    samples_per_volume,
    sampler,
    num_workers=4,
)
patches_loader = DataLoader(
    patches_queue,
    batch_size=16,
    num_workers=0,  # this must be 0
)
num_epochs = 2
model = torch.nn.Identity()
for epoch_index in range(num_epochs):
    for patches_batch in patches_loader:
        inputs = patches_batch['t1'][tio.DATA]  # key 't1' is in subject
        targets = patches_batch['brain'][tio.DATA]  # key 'brain' is in subject
        logits = model(inputs)  # model being an instance of torch.nn.Module

print(patches_batch['t1'][tio.DATA].shape, patches_batch['brain'][tio.DATA].shape)

Actual outcome

(torch.Size([4, 1, 96, 96, 96]), torch.Size([4, 1, 96, 96, 96]))

Error messages

No response

Expected outcome

(torch.Size([16, 1, 96, 96, 96]), torch.Size([16, 1, 96, 96, 96]))
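
Note: a likely explanation (my reading, not confirmed in this thread): the queue yields len(subjects_dataset) * samples_per_volume = 10 * 10 = 100 patches per epoch, and torch.utils.data.DataLoader keeps the final partial batch by default (drop_last=False). The print statement runs after the loop, so it shows that last batch, which has 100 % 16 = 4 patches. A minimal sketch of the arithmetic, with all values taken from the reproduction script:

# Sketch (assumption): why the final batch of each epoch has 4 patches.
num_subjects = 10
samples_per_volume = 10
batch_size = 16

patches_per_epoch = num_subjects * samples_per_volume        # 100 patches per epoch
full_batches, remainder = divmod(patches_per_epoch, batch_size)
print(full_batches, remainder)  # 6 full batches of 16, then a final batch of 4

# Passing drop_last=True to the DataLoader would discard that partial batch:
# patches_loader = DataLoader(patches_queue, batch_size=16, num_workers=0, drop_last=True)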

System info

Platform:   Linux-6.5.0-35-generic-x86_64-with-glibc2.35
TorchIO:    0.19.6
PyTorch:    2.3.0+cu118
SimpleITK:  2.3.1 (ITK 5.3)
NumPy:      1.26.4
Python:     3.9.19 (main, May  6 2024, 19:43:03) [GCC 11.2.0]
@icekang icekang added the bug Something isn't working label May 28, 2024
@icekang icekang closed this as completed May 28, 2024