How to run inference on larger frames e.g. 360p? #12
Comments
If out-of-memory, try to reduce the tile size.
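For context, tiled inference works by splitting each frame into overlapping patches, running the model on one patch at a time, and stitching the outputs back into a full-resolution frame, so peak GPU memory scales with the tile size rather than the frame size. A minimal sketch of the tile-coordinate logic in plain Python (the function name `tile_starts` and the sizes below are illustrative, not VRT's actual implementation):

```python
def tile_starts(length, tile, overlap):
    """Start indices of overlapping tiles that cover [0, length).

    Each tile spans `tile` pixels and consecutive tiles overlap by
    `overlap` pixels; the last tile is shifted left so it ends exactly
    at `length`.
    """
    if tile >= length:
        return [0]  # a single tile covers the whole axis
    stride = tile - overlap
    starts = list(range(0, length - tile, stride))
    starts.append(length - tile)  # final tile flush with the border
    return starts

# Tile a 640x360 frame into 128x128 patches with a 16-pixel overlap.
rows = tile_starts(360, 128, 16)   # [0, 112, 224, 232]
cols = tile_starts(640, 128, 16)
patches = [(r, c) for r in rows for c in cols]
```

Each `(r, c)` patch would then be cropped, run through the model, and written into an output buffer, with overlapping regions typically averaged to hide seams.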
Hi @JingyunLiang! Thanks for the quick reply. Unfortunately, regardless of the tile size that I use, I'm still getting an OOM error. Here's some sample code to replicate: first to download a video from YT and extract frames, and then to run inference.
Then running your test code.
Maybe loading the whole video into GPU consumes too much GPU memory?
You were right. Loading clips longer than 5 seconds was killing the GPU. Using …
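The fix described above amounts to splitting the frame sequence into short fixed-length clips and processing each one independently, so only the active clip ever needs to sit in GPU memory. A small sketch of that chunking step (the helper name `clip_chunks` and the 150-frame clip length are illustrative, not from the repo):

```python
def clip_chunks(frame_paths, clip_len):
    """Split a list of frame paths into consecutive clips of at most
    `clip_len` frames, processed one clip at a time to bound GPU memory."""
    return [frame_paths[i:i + clip_len]
            for i in range(0, len(frame_paths), clip_len)]

# e.g. a 30 fps video: 5-second clips are 150 frames each
frames = [f"frame_{i:06d}.png" for i in range(400)]
clips = clip_chunks(frames, 150)  # clip lengths: 150, 150, 100
```

Since VRT is a temporal model, hard clip boundaries lose cross-clip context; one could overlap neighbouring clips by a few frames and discard the duplicated outputs to soften boundary artifacts.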
Yes, it is slow if you test it patch by patch. Testing different patches in parallel can help.
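The parallel-patch idea above can be sketched as grouping independent tiles along the batch dimension, so several patches go through the model in one forward pass instead of one at a time (the helper `batched` and the tile grid are illustrative):

```python
def batched(items, batch_size):
    """Group a flat list of patches into batches for a single forward pass."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# A 3x5 grid of tile origins over a 640x360 frame (illustrative strides).
patches = [(r, c) for r in range(0, 360, 120) for c in range(0, 640, 128)]
batches = list(batched(patches, 4))  # 15 patches -> batches of 4, 4, 4, 3
```

Peak memory grows roughly linearly with the batch size, so in practice one would pick the largest batch of tiles that still fits on the card.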
Hello! Thanks for the great work with VRT. I wanted to know if you have any tips or recommendations on how we can run your evaluation code against our own higher-resolution frames. From my tests, anything above 180p just runs OOM on a K80 (12G) and a T4 (16G), regardless of the tile size that I use, for all models (REDS, Vimeo, etc.). Do you have any advice? Thanks!
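A rough way to see why 360p fails where 180p fits: activation memory grows at least linearly with pixel count, and 360p has 4x the pixels of 180p. The arithmetic below is a back-of-envelope sketch with illustrative numbers, not a measurement of VRT:

```python
def pixels(height):
    """Pixel count of a 16:9 frame at a given height (e.g. 360 -> 640x360)."""
    width = height * 16 // 9
    return width * height

ratio = pixels(360) / pixels(180)  # 4.0

# Purely illustrative: if 180p inference peaks near the 12 GB limit of a
# K80, 4x the pixels would push activations well past a 16 GB T4, which
# matches the observed OOMs unless tiling keeps per-pass input small.
```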