
Excessive memory usage during inference #35

Closed
liclshixiaokeaiya opened this issue Aug 10, 2022 · 3 comments

Comments

@liclshixiaokeaiya

When I run inference with the trained model, certain images use an excessive amount of GPU memory; the peak can exceed 10 GB.

@lkeab
Collaborator

lkeab commented Aug 10, 2022

If you have tight GPU memory, you can slightly reduce the limit here.
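The "limit" referred to above caps how many candidate instances are processed per image at inference time; the link to the exact config line is not preserved here, so the names below (`keep_top_k`, `proposals`, `limit`) are illustrative rather than the repository's actual identifiers. A minimal sketch of the pattern:

```python
# Hedged sketch: cap the number of candidate detections kept per image
# before the memory-heavy per-instance stages run. All identifiers here
# are hypothetical, not the repository's actual config names.

def keep_top_k(proposals, limit):
    """Keep at most `limit` proposals, highest score first.

    proposals: list of (score, box) tuples.
    """
    ranked = sorted(proposals, key=lambda p: p[0], reverse=True)
    return ranked[:limit]

# Peak memory typically grows with the number of instances kept, so
# lowering the limit trades a small amount of recall for a lower peak.
dets = [(0.9, "a"), (0.2, "b"), (0.7, "c"), (0.5, "d")]
print(keep_top_k(dets, 2))  # → [(0.9, 'a'), (0.7, 'c')]
```

Because the cap is applied only at inference time, changing it does not alter any trained weights, which is why no retraining is needed (as confirmed below in the thread).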

@liclshixiaokeaiya liclshixiaokeaiya mentioned this issue Aug 10, 2022
@liclshixiaokeaiya
Author

Do I need to retrain after reducing the limit?

@lkeab
Collaborator

lkeab commented Aug 10, 2022

No.
