
A bug occurs when batch_size is set to 1 #18

Open
liuhui0401 opened this issue Nov 30, 2021 · 1 comment

Comments

@liuhui0401

When I set batch_size to 1, I get the following error: "ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 512, 1, 1])". How can I solve this?

@liuhui0401 (Author)

OK, I got it. In training mode, a BatchNorm layer needs more than one value per channel to compute the batch statistics (mean and variance), so a 1×512×1×1 input gives it nothing to average over.
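
For anyone who hits the same error, here is a minimal sketch, assuming the layer involved is PyTorch's `nn.BatchNorm2d` (the reported shape `torch.Size([1, 512, 1, 1])` suggests a 512-channel feature map). It reproduces the error and shows two common workarounds:

```python
import torch
import torch.nn as nn

# 512 channels, matching the reported input size torch.Size([1, 512, 1, 1])
bn = nn.BatchNorm2d(512)

x = torch.randn(1, 512, 1, 1)  # batch_size = 1, spatial size 1x1

# In training mode, each channel has exactly one value here, so BatchNorm
# cannot compute a batch variance and raises the ValueError from the issue.
bn.train()
try:
    bn(x)
except ValueError as e:
    print(e)  # Expected more than 1 value per channel when training, ...

# Workaround 1: train with a batch size > 1, and pass drop_last=True to the
# DataLoader so a trailing partial batch of size 1 is skipped, e.g.:
#   loader = torch.utils.data.DataLoader(dataset, batch_size=2, drop_last=True)

# Workaround 2: for inference only, switch to eval mode so BatchNorm uses its
# running statistics instead of per-batch statistics; batch_size = 1 then works.
bn.eval()
out = bn(x)
print(out.shape)  # torch.Size([1, 512, 1, 1])
```

If you genuinely need to train with batch size 1, replacing BatchNorm with a batch-size-independent normalization such as `nn.GroupNorm` or `nn.InstanceNorm2d` is another option, though that changes the model rather than just the data pipeline.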
