This repository is really great! Decent samples too.
I see a huge opportunity for this to be extended to support mobile.
There are a number of obstacles to this of course, including running on TF-Lite.
If you ported it to Dart you could transpile it to iOS and Android.
Thanks! We also plan to expand 🐸 TTS's use on offline and low-compute devices.
Currently, some of the models we release already run in real time using just the PyTorch back-end.
For mobile targets, we can make use of TorchScript, ONNX, or TF-Lite export. So, one step at a time, we plan to get there.
Actually, you can already run the vanilla Tacotron on TF-Lite. There is some documentation here: https://github.com/coqui-ai/TTS/wiki/Converting-Torch-Tacotron-to-TF-2.0
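For anyone curious what the TorchScript route mentioned above looks like in practice, here is a minimal sketch. Note this uses a tiny stand-in `torch.nn.Module`, not the actual Tacotron model — exporting a real TTS model involves handling variable-length inputs and autoregressive decoding, so treat this only as an illustration of the export mechanics:

```python
# Minimal sketch of TorchScript export, using a toy stand-in module
# (hypothetical example, NOT the actual 🐸 TTS Tacotron).
import torch

class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(80, 80)

    def forward(self, x):
        return torch.tanh(self.linear(x))

model = TinyNet().eval()
example = torch.randn(1, 80)

# Tracing records the ops executed on the example input and
# compiles them into a Python-independent module.
scripted = torch.jit.trace(model, example)
scripted.save("tiny_net.pt")

# The saved module reloads without the original Python class,
# which is what makes it consumable from the mobile (C++/Java) runtimes.
reloaded = torch.jit.load("tiny_net.pt")
assert torch.allclose(reloaded(example), model(example))
```

The same general idea applies to ONNX via `torch.onnx.export`, which takes a model plus an example input and writes a portable graph that runtimes like ONNX Runtime can execute on-device.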
There are currently three FOSS TTS engines for mobile: https://search.f-droid.org/?q=tts+engine&lang=en. A coqui-ai .apk could improve that short list.