support multiple GPU training for XTTS #3391
Merged
Set the parameter in
recipes/ljspeech/xtts_v2/train_gpt_xtts.py
Now we can run multi-GPU training with the DDP back-end like this:
$ CUDA_VISIBLE_DEVICES="0, 1" python -m trainer.distribute --script recipes/ljspeech/xtts_v2/train_gpt_xtts.py
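For intuition, a distributed launcher along these lines typically parses `CUDA_VISIBLE_DEVICES` and re-runs the training script once per GPU, pinning each worker process to a single device and passing its rank via environment variables. This is a simplified, hypothetical sketch of that bookkeeping (the helper names `parse_visible_devices` and `build_worker_env` are my own, not part of `trainer.distribute`):

```python
import os

def parse_visible_devices(value):
    """Split a CUDA_VISIBLE_DEVICES string into device ids.

    Tolerates spaces, e.g. "0, 1" -> [0, 1].
    """
    return [int(tok) for tok in value.split(",") if tok.strip() != ""]

def build_worker_env(gpus, rank):
    """Environment a DDP-style launcher would hand to worker `rank`
    (simplified sketch, not the actual trainer.distribute internals)."""
    env = dict(os.environ)
    env.update({
        "CUDA_VISIBLE_DEVICES": str(gpus[rank]),  # pin one GPU per process
        "RANK": str(rank),                        # global rank of this worker
        "LOCAL_RANK": str(rank),                  # rank on this machine
        "WORLD_SIZE": str(len(gpus)),             # total number of workers
        "MASTER_ADDR": "localhost",               # rendezvous address
        "MASTER_PORT": "29500",                   # rendezvous port
    })
    return env

gpus = parse_visible_devices("0, 1")
worker_envs = [build_worker_env(gpus, rank) for rank in range(len(gpus))]
```

With two visible devices this yields two worker environments, each seeing exactly one GPU, which is why the training script itself can stay mostly single-GPU code.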
I ran an experiment with 2 GPUs and everything seems fine.
I saw a TODO here, but I'm not very familiar with this function, so I'm not sure whether it behaves any differently here than in single-GPU training.
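Without knowing which function the TODO refers to, one thing that commonly differs between single-GPU and DDP training is how the dataset is sharded: each rank must see a disjoint slice per epoch, shuffled with the same seed on every rank. This is a simplified, framework-free sketch of that idea (mirroring what PyTorch's `DistributedSampler` does; `shard_indices` is a hypothetical helper, not code from this PR):

```python
import random

def shard_indices(num_samples, world_size, rank, epoch=0):
    """Deterministically shard dataset indices across DDP ranks.

    Every rank seeds the shuffle identically (from the epoch), so all
    ranks agree on the order, then each takes a strided, disjoint slice.
    """
    rng = random.Random(epoch)          # same seed on every rank
    indices = list(range(num_samples))
    rng.shuffle(indices)                # identical shuffle everywhere
    pad = (-len(indices)) % world_size  # pad so shards are equal-sized
    indices += indices[:pad]
    return indices[rank::world_size]    # rank-specific disjoint slice

shard0 = shard_indices(10, world_size=2, rank=0, epoch=3)
shard1 = shard_indices(10, world_size=2, rank=1, epoch=3)
```

If the function behind the TODO touches data loading or batching, this sharding (and re-seeding per epoch) is the main behavior worth checking against the single-GPU path.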