Is it possible to use a smaller GPU for inference? #8
Comments
Hi @AmitRozner
My GPU is a V100 with 16 GB; when I run
Hi @cqlyiyeshu
@lijiannuist My GPU has 8 GB. How can I solve the error "RuntimeError: CUDA out of memory"?
@lijiannuist Do you mean resizing the input image to a smaller size?
Yes, I resized it to 100x100.
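The commenter shrank the input to a fixed 100x100, but for detection it is usually better to cap the longest side while preserving aspect ratio so faces are not distorted. A minimal sketch of that dimension calculation (the helper name `downscale_dims` and the 100-pixel cap are illustrative, not from the repo):

```python
def downscale_dims(width, height, max_side=100):
    """Return new (width, height) so the longer side is at most max_side,
    preserving aspect ratio. Images already small enough are unchanged."""
    longest = max(width, height)
    if longest <= max_side:
        return width, height
    scale = max_side / longest
    # round and clamp so neither dimension collapses to zero
    return max(1, round(width * scale)), max(1, round(height * scale))

# A 200x100 image would be resized to 100x50 before inference.
print(downscale_dims(200, 100))
```

The resulting dimensions would then be passed to whatever resize routine the pipeline uses (e.g. `cv2.resize` or `PIL.Image.resize`).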
Hi, try to add
You can use this https://github.com/vlad3996/FaceDetection-DSFD with the original author's checkpoint.
With torch 0.4+ it does run out of memory. Is there any way to free the memory? Right now I can only test one image at a time.
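On PyTorch 0.4+, the usual fixes for inference-time OOM are to wrap the forward pass in `torch.no_grad()` (so no activations are kept for a backward pass) and to call `torch.cuda.empty_cache()` between images to release cached blocks. A minimal sketch, using a tiny stand-in module rather than the actual DSFD network:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the detector; in practice the DSFD model
# would be loaded from the repo's checkpoint instead.
model = nn.Conv2d(3, 8, kernel_size=3, padding=1)
model.eval()  # inference mode for dropout/batchnorm layers

def detect_one(image_tensor):
    """Run inference on a single image without autograd bookkeeping,
    then release cached GPU memory before the next image."""
    with torch.no_grad():              # activations freed immediately
        out = model(image_tensor.unsqueeze(0))
    if torch.cuda.is_available():
        torch.cuda.empty_cache()       # return cached blocks to the driver
    return out

result = detect_one(torch.randn(3, 32, 32))
```

`empty_cache()` does not make more memory available to the current tensors; it only hands unused cached memory back, which helps when other processes share the GPU or when fragmentation is the problem.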
I read that you train with 8 images per batch on a P40. Is it possible to use the code on a GTX 1080 Ti (12GB) with a smaller batch size?
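Lowering the batch size is just a matter of feeding the data loader smaller chunks; memory use scales roughly linearly with batch size, so a card with half the P40's memory can try roughly half the batch. A minimal, framework-free sketch of the chunking (names like `batches` and the example paths are illustrative):

```python
def batches(items, batch_size):
    """Yield successive chunks of at most batch_size items, so the same
    dataset can be processed with a smaller per-step memory footprint."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

# e.g. 8 images per batch on a P40 might become 4 on a smaller card
image_paths = [f"img_{i}.jpg" for i in range(10)]
chunks = list(batches(image_paths, 4))
print(len(chunks))  # 10 images in batches of 4 -> 3 chunks (4, 4, 2)
```

In PyTorch this corresponds to simply passing a smaller `batch_size` to `torch.utils.data.DataLoader`; note that for training, a smaller batch may also call for adjusting the learning rate.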