RuntimeError: CUDA out of memory. #1005

Are the errors above really due to a GPU memory issue?

Yes, the error was due to a lack of GPU memory.

Is my GPU able to handle training TTS models? Or do I need to get a better machine with a better GPU to achieve good results on a TTS task?

4 GB is really not enough for training TTS models, and it also depends heavily on which model you are interested in. Glow-TTS is memory-hungry. Besides, the sentences in your dataset are long (max 823, avg 238 phones), and longer sentences consume more memory. You might be able to train a Tacotron2 with max_seq_len=150 and batch_size=10, but you may lose a lot of your sentences. A smaller batch_size is okay, but it is good to keep it above 10.
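To get a feel for how many utterances a max_seq_len=150 cutoff would discard, you can check your dataset before training. Below is a minimal sketch (not part of Coqui TTS itself) that assumes an LJSpeech-style, pipe-delimited metadata.csv and uses the character count of the transcript as a crude stand-in for the phoneme count:

```python
# Rough estimate of how many utterances would be dropped by a length cutoff.
# Assumptions: "id|raw_text|normalized_text" rows in metadata.csv, and
# transcript character length used as a proxy for phoneme length.
import csv
import statistics

MAX_SEQ_LEN = 150          # cutoff suggested above
METADATA = "metadata.csv"  # hypothetical path to your dataset metadata

lengths = []
with open(METADATA, encoding="utf-8") as f:
    for row in csv.reader(f, delimiter="|"):
        if len(row) >= 2:
            lengths.append(len(row[-1]))  # length of the normalized transcript

dropped = sum(1 for n in lengths if n > MAX_SEQ_LEN)
print(f"utterances: {len(lengths)}")
print(f"max length: {max(lengths)}, avg length: {statistics.mean(lengths):.0f}")
print(f"dropped with max_seq_len={MAX_SEQ_LEN}: {dropped} "
      f"({100 * dropped / len(lengths):.1f}%)")
```

If a large fraction of the dataset falls above the cutoff, splitting long sentences into shorter ones is usually preferable to lowering max_seq_len even further.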
