If, during synthesis, you save and plot the attention computed with the model pretrained on LJ Speech, for example, it looks like this:

Why is it horizontal rather than diagonal, as it is during training? The synthesis works just fine, though.
If I comment out, in `networks.py`, the part of the `Attention` function corresponding to monotonic attention, like this:
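For reference, forced monotonic attention during synthesis usually works by masking out encoder positions far from the previously attended index, so the alignment can only advance left to right. A minimal sketch of that idea is below; the function and argument names (`attention_weights`, `prev_max`, `window`) are illustrative, not the repo's actual API, and the masked lines mark the kind of code being commented out:

```python
import numpy as np

def attention_weights(query, keys, prev_max=None, window=3, monotonic=True):
    """Scaled dot-product attention over encoder positions.

    With `monotonic=True`, only a small forward window starting at the
    previously attended index `prev_max` is kept; everything else gets a
    large negative score before the softmax. This is a sketch of the
    idea, not the exact dc_tts implementation.
    """
    scores = keys @ query / np.sqrt(len(query))
    if monotonic and prev_max is not None:
        mask = np.full_like(scores, -1e9)
        lo, hi = prev_max, min(prev_max + window, len(scores))
        mask[lo:hi] = 0.0
        # Commenting out the next line disables the monotonic constraint,
        # recovering the plain (diagonal-looking) attention.
        scores = scores + mask
    e = np.exp(scores - scores.max())
    return e / e.sum()

# The monotonic mask prevents the alignment from jumping backwards:
keys = np.eye(5)
w_mono = attention_weights(keys[0], keys, prev_max=2, monotonic=True)
w_free = attention_weights(keys[0], keys, prev_max=2, monotonic=False)
# w_mono peaks at index >= 2; w_free peaks at index 0.
```

Disabling the mask lets the softmax attend anywhere, which explains both the diagonal plot and the skipping/repeating artifacts described below: nothing forces the alignment to advance exactly one region at a time.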
then the attention plot becomes diagonal, and the synthesis is not too bad, but it has the problem mentioned in the paper: it may skip letters or pronounce parts of words several times.