
OOM && Large GPU usage #50

Closed
arnavmehta7 opened this issue Jan 30, 2023 · 4 comments

@arnavmehta7 (Contributor)

Hi, I am using this, but it doesn't seem to release GPU VRAM. I am using the large-v2 model and it's using 16 GB of GPU RAM, which I don't think is normal at all; moreover, it doesn't free the RAM afterwards. Is this expected?

@m-bain (Owner) commented Jan 31, 2023

Memory scales with the length of the input segment passed to the wav2vec2 model. With the current version, adding
--vad_filter
ensures that no segments longer than 30 s are fed to the w2v model. You can further reduce GPU requirements by:
a. using a smaller w2v align model
b. limiting the input segment duration

I will add (b) as a new feature at some point.
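A minimal invocation using the flag above might look like the following; the audio file name is a placeholder, and `--vad_filter` is taken from the comment above:

```shell
# Hypothetical file name; --vad_filter caps alignment inputs at 30 s segments,
# which bounds the wav2vec2 model's memory use.
whisperx audio.wav --model large-v2 --vad_filter
```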

@arnavmehta7 (Contributor, Author)

I am using an older commit, and it's now using 28 GB of GPU RAM, which is far too much. Any idea how to fix this?

@arnavmehta7 (Contributor, Author)

Files are being processed in a loop, and it seems that after every file, GPU RAM usage grows by roughly 2 GB.
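Growth of roughly 2 GB per iteration usually means a reference to the model (or its intermediate tensors) is kept alive across loop iterations, so PyTorch's caching allocator never returns the memory. A sketch of a per-file cleanup pattern, where `load_model` and `transcribe` are hypothetical callables standing in for the library's actual API:

```python
import gc


def transcribe_files(paths, load_model, transcribe):
    """Process files one at a time, releasing GPU memory between files.

    `load_model` and `transcribe` are placeholders for the real
    model-loading and transcription calls.
    """
    results = []
    for path in paths:
        model = load_model()
        results.append(transcribe(model, path))
        # Drop the last reference and force a collection pass so the
        # tensors backing the model become eligible for freeing.
        del model
        gc.collect()
        try:
            import torch
            if torch.cuda.is_available():
                # Return cached allocator blocks to the driver so other
                # processes (and nvidia-smi) see the memory as free.
                torch.cuda.empty_cache()
        except ImportError:
            pass  # torch not installed; nothing GPU-side to release
    return results
```

If memory still grows, a common culprit is accumulating result objects that hold GPU tensors; moving outputs to the CPU (or extracting plain Python data) before appending them avoids that.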

@arnavmehta7 (Contributor, Author)

@m-bain I just ran a very long audio file and it tried to allocate 30 GB of GPU RAM. Is there any clean, fast fix?
Thank you!
