
Inference is causing Out of Memory error (even for V100) #4

Closed
wind-surfer opened this issue Sep 29, 2021 · 4 comments

@wind-surfer

Hello,
I tried running `main_test_fbcnn_color.py` on a real JPEG image using a 16 GB V100, but the code threw an `Out of Memory` error. Any idea how to use this code with large images, say 12 MPix or more?

@jiaxi-jiang (Owner)

Hi, thanks for your interest!

You can either run on the CPU or split the large image into smaller tiles, e.g. based on the code here: https://github.com/cszn/KAIR/blob/master/utils/utils_model.py#L172
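The split/merge idea can be sketched as follows. This is a generic illustration, not the KAIR code linked above: the `tiled_inference` function and its `process` callable (a stand-in for the model's forward pass) are hypothetical names, and the overlap-and-crop strategy is one common way to hide seams at tile borders.

```python
import numpy as np

def tiled_inference(img, process, tile=256, overlap=16):
    """Run `process` on overlapping tiles of `img` (H, W, C) and stitch
    the outputs back together. Each tile is padded by `overlap` pixels
    on every side, and only the tile's central region is kept, so any
    border artifacts fall inside the discarded overlap."""
    assert tile > 2 * overlap, "tile must be larger than twice the overlap"
    h, w = img.shape[:2]
    out = np.zeros_like(img, dtype=np.float32)
    step = tile - 2 * overlap  # stride between tile centers
    for y in range(0, h, step):
        for x in range(0, w, step):
            # Padded tile coordinates, clipped at the image borders.
            y0, x0 = max(y - overlap, 0), max(x - overlap, 0)
            y1 = min(y + step + overlap, h)
            x1 = min(x + step + overlap, w)
            result = process(img[y0:y1, x0:x1])
            # Copy back only the central (non-overlap) part.
            iy0, ix0 = y - y0, x - x0
            iy1 = iy0 + min(step, h - y)
            ix1 = ix0 + min(step, w - x)
            out[y:y + iy1 - iy0, x:x + ix1 - ix0] = result[iy0:iy1, ix0:ix1]
    return out
```

With an identity `process`, the stitched output reproduces the input exactly, which is a quick sanity check that the tiling covers the image without gaps or double-writes. For the real model, `process` would move one tile to the GPU, run the network, and return the result, so peak memory is bounded by the tile size rather than the full 12 MPix image.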

@wind-surfer (Author)

Hello,
Thank you for your prompt reply. Do you think the splitting -> inference -> merging pipeline will produce any artifacts at the edges of the image tiles?

@jiaxi-jiang (Owner)

Hi, I don't think so. I once tested large images using this method and did not see any artifacts at the edges.

@wind-surfer (Author)

Sure. Thank you for your prompt response.
