
Regarding torch.utils.data.DataLoader and batch_size #873

@jjong2ya

Description

Hi, all you high-level engineers!

I am very new to PyTorch and even to Python.
Nonetheless, I am trying to learn PyTorch from its tutorial documentation.
Right now I am going through TRAINING A CLASSIFIER, which uses the CIFAR10 dataset.

Code Link: https://pytorch.org/tutorials/beginner/blitz/cifar10_tutorial.html

In the code, there are the following lines:

for i, data in enumerate(trainloader, 0):
    inputs, labels = data

I have checked the sizes of data[0] and inputs[0], and they are different: [4, 3, 32, 32] and [3, 32, 32] respectively.
I understand that data[0] has the size [4, 3, 32, 32], as it is a batch that contains 4 images.

Question

  1. But why is the size of inputs[0] [3, 32, 32]? As far as I can tell, inputs[0] is the first image of data[0].
    Why does inputs[0] hold only the first image of data[0]? According to the code, shouldn't inputs[0] equal data[0]?

  2. According to the tutorial, "data is a list of [inputs, labels]". But I don't see a labels[0] value anywhere in data[0].
    Why is that so?
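To make my confusion concrete, here is a tiny sketch of how I currently understand the structure, using plain Python lists standing in for the tensors (the "img…" strings and label values are hypothetical placeholders, and the shapes are shrunk for readability):

```python
# Stand-in for one batch yielded by the DataLoader with batch_size=4:
# data = [inputs, labels], where inputs stacks 4 images together.
inputs_batch = [["img0"], ["img1"], ["img2"], ["img3"]]  # plays the role of the [4, 3, 32, 32] tensor
labels_batch = [0, 1, 2, 3]                              # one label per image in the batch

data = [inputs_batch, labels_batch]

# The tutorial's unpacking line:
inputs, labels = data

print(data[0] == inputs)        # data[0] is the WHOLE input batch, i.e. inputs itself
print(data[0][0] == inputs[0])  # a single image sits one indexing level deeper
print(data[1] == labels)        # the labels are data[1], not stored inside data[0]
```

So if this sketch is right, inputs is data[0] (the batch of 4 images), inputs[0] is data[0][0] (one image of size [3, 32, 32]), and the labels live in data[1] rather than in data[0]. Is that the correct way to read it?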

Sorry if this is too basic, but please help.
