[Bug] Loading XTTS via Xtts.load_checkpoint() #3177
Comments
I am encountering the same problem.
Same
+1
Delete the tts_models--multilingual--multi-dataset--xtts_v2 folder and let the model download again. That fixed the issue for me.
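The workaround above can be sketched in Python (the cache path below assumes the default TTS location on Linux, `~/.local/share/tts`; adjust it if your cache lives elsewhere):

```python
import shutil
from pathlib import Path

# Default TTS model cache location on Linux (an assumption; adjust as needed).
model_dir = Path.home() / ".local/share/tts/tts_models--multilingual--multi-dataset--xtts_v2"

# Delete the cached model; TTS will re-download it on the next load.
shutil.rmtree(model_dir, ignore_errors=True)
```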
Redownloading the model fixes the problem, but I have found that the WAV quality with this method is worse than when using the API; I don't know why.
I have checked how the TTS API loads the same exact model, and it's different from the code example in the documentation: https://github.com/coqui-ai/TTS/blob/dev/docs/source/models/xtts.md This is how I managed to load the model without errors:

```python
from pathlib import Path

from TTS.tts.models import setup_model as setup_tts_model
from TTS.config import load_config

model_dir = Path("/home/user/.local/share/tts/tts_models--multilingual--multi-dataset--xtts_v2")
config = load_config(model_dir / "config.json")
model = setup_tts_model(config)
model.load_checkpoint(
    config,
    checkpoint_dir=model_dir,
    eval=True,
    # use_deepspeed=True,
)
model.to("cuda")
```
@Aya-AlJafari can you check the code above? It should have worked.
This issue happens because the loaded model is not using the decoding parameters that are in config.json. You need to set them manually. Example:
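As an illustration of setting the decoding parameters manually, here is a minimal, self-contained sketch of pulling them out of a config.json so they can be passed explicitly at inference time. The keys below are assumptions based on common XTTS settings, not the exact Xtts API or schema:

```python
import json

# Hypothetical config.json fragment with XTTS-style decoding parameters
# (the keys and values here are assumptions for illustration only).
config = json.loads("""
{
    "temperature": 0.65,
    "length_penalty": 1.0,
    "repetition_penalty": 2.0,
    "top_k": 50,
    "top_p": 0.85
}
""")

# Collect the decoding parameters so they can be passed explicitly to the
# model's inference call instead of silently falling back to defaults.
decode_keys = ("temperature", "length_penalty", "repetition_penalty", "top_k", "top_p")
decode_kwargs = {k: config[k] for k in decode_keys if k in config}
print(decode_kwargs)
```

The resulting dict would then be unpacked into the inference call (e.g. `model.inference(..., **decode_kwargs)`), keeping the generated audio consistent with what config.json specifies.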
I think the issue was caused by my mistake of not providing the right config file. After loading the
@erogol checked and it works. |
The above code fragment (and my usage of it) started failing with this error. It was working fine until about an hour ago.
Describe the bug
When loading the model using Xtts.load_checkpoint, an exception is raised: "Error(s) in loading state_dict for Xtts", which reports missing keys for the GPT embedding weights and a size mismatch on the Mel embedding. I even tried providing the directory containing the base (v2) model checkpoints and got the same result.
To Reproduce
Expected behavior
Load the checkpoint and run inference without exception.
Logs
Environment
Additional context
No response