Have you ever used a batch_size larger than 128 per GPU? #50

Closed
PistonY opened this issue Dec 6, 2021 · 1 comment

Comments

PistonY commented Dec 6, 2021

My loss gets stuck at 0.985 when using norm_target with a batch size of 256 per GPU. Have you run into this?
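For context, a minimal sketch of what a norm_target-style option typically does in MAE-style pretraining: reconstruction targets are normalized per patch by their own mean and variance before the masked-patch regression loss. The function and variable names below are illustrative assumptions, not necessarily this repo's actual API:

```python
import torch

def normalize_patch_targets(patches: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Per-patch normalization of reconstruction targets (MAE-style).

    patches: (batch, num_patches, patch_dim) raw pixel values.
    Each patch is standardized by its own mean and variance, so the
    MSE loss is computed against normalized pixel targets.
    """
    mean = patches.mean(dim=-1, keepdim=True)
    var = patches.var(dim=-1, keepdim=True)
    return (patches - mean) / (var + eps).sqrt()

# Illustrative usage: targets for the masked-patch loss at batch size 256 per GPU,
# with 196 patches of dimension 768 (hypothetical shapes).
targets = normalize_patch_targets(torch.randn(256, 196, 768))
```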

@pengzhiliang (Owner)

Sorry, I have not encountered this problem, even with a batch size of 256.
