Because my machine supports a batch size of at most 128, I set the batch size to 128. Will that hurt accuracy? Yes, it will; it is better to use a batch size of 1024.
How much GPU memory does a batch size of 1024 require?
Originally posted by @tangjialiang-jj in #10 (comment)
For CIFAR-10, two 1080 Tis are enough.
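If the hardware only fits a batch of 128, gradient accumulation is a common way to approximate the recommended batch of 1024 without more memory. The repo's training code is not shown here, so this is a framework-agnostic NumPy sketch (linear model, squared error, all names hypothetical) that just demonstrates the underlying identity: for a loss that averages over samples, averaging the gradients of 8 micro-batches of 128 reproduces the gradient of one batch of 1024. Note this equivalence does not cover batch-dependent layers such as batch norm, whose statistics still see only 128 samples.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1024, 10))  # a "large batch" of 1024 samples
y = rng.standard_normal(1024)
w = rng.standard_normal(10)

def grad(Xb, yb, w):
    # Gradient of the mean squared error 0.5 * mean((Xb @ w - yb)**2) w.r.t. w.
    return Xb.T @ (Xb @ w - yb) / len(yb)

# Full-batch gradient at batch size 1024.
g_full = grad(X, y, w)

# Accumulate gradients over 8 micro-batches of 128, then average.
g_acc = np.zeros_like(w)
for i in range(0, 1024, 128):
    g_acc += grad(X[i:i + 128], y[i:i + 128], w)
g_acc /= 8

# The accumulated gradient matches the batch-1024 gradient.
print(np.allclose(g_full, g_acc))
```

In a training loop this means calling backward on each micro-batch and stepping the optimizer only every 8 micro-batches.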
When you ran it on two 1080 Tis, did you apply any additional parallelization? Running on my two 1080 Tis, the largest batch size I can set is 256.
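The issue does not show how the original run was parallelized; a minimal PyTorch sketch of the usual two-GPU setup with `nn.DataParallel` looks like the following (the model and tensor shapes are placeholders, not the repo's actual code). Each input batch is split along dimension 0 across the visible GPUs, so a batch of 256 becomes 128 per 1080 Ti; the sketch falls back to a single device when fewer GPUs are available.

```python
import torch
import torch.nn as nn

# A small stand-in model; the actual model in this repo is unknown.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))

# nn.DataParallel scatters each batch across device_ids, runs the
# replicas in parallel, and gathers the outputs back onto one device.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model, device_ids=[0, 1]).cuda()

batch = torch.randn(256, 3, 32, 32)  # CIFAR-10-shaped dummy batch
if torch.cuda.is_available():
    batch = batch.cuda()

logits = model(batch)
print(tuple(logits.shape))  # outputs are gathered into one (256, 10) tensor
```

If 256 is still the ceiling, combining this with gradient accumulation, or switching to `torch.nn.parallel.DistributedDataParallel` (which has lower per-GPU overhead), are the usual next steps.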