
Support offline mobile #381

Closed

boxabirds opened this issue Mar 15, 2021 · 2 comments


boxabirds commented Mar 15, 2021

This repository is really great! Decent samples too.

I see a huge opportunity for this to be extended to support mobile.

There are a number of obstacles to this, of course, including running on TF-Lite.

If you ported it to Dart, you could transpile it to iOS and Android.

erogol (Member) commented Mar 15, 2021

Thanks! We also plan to expand 🐸 TTS's use to offline and low-compute devices.

Currently, some of the models we release already run in real time using just the PyTorch back-end.

For mobile, we can make use of TorchScript, ONNX, or TF-Lite export, so we plan to get there one step at a time.
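
(For illustration, here is a rough sketch of what the TorchScript and ONNX export steps could look like for a generic PyTorch module. The `TinyModel`, the input shape, and the file names are placeholders, not the actual 🐸 TTS export code.)

```python
import torch

# Placeholder module standing in for a TTS acoustic model;
# real 🐸 TTS models and their input shapes will differ.
class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(80, 80)

    def forward(self, x):
        return self.linear(x)

model = TinyModel().eval()
example_input = torch.randn(1, 50, 80)  # (batch, frames, features) -- illustrative only

# TorchScript export via tracing
traced = torch.jit.trace(model, example_input)
traced.save("model_traced.pt")

# ONNX export of the same model
torch.onnx.export(
    model, example_input, "model.onnx",
    opset_version=11,
    input_names=["input"], output_names=["output"],
)
```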

Actually, you can already run the vanilla Tacotron on TF-Lite. There is some documentation here: https://github.com/coqui-ai/TTS/wiki/Converting-Torch-Tacotron-to-TF-2.0
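
(And a rough sketch of loading and running a converted model with the TF-Lite interpreter; the file name `tacotron.tflite` and the zero-filled dummy input are placeholders, assuming the conversion from the wiki produced a single-input `.tflite` file with a fixed input shape.)

```python
import numpy as np
import tensorflow as tf

# Load the converted model; the path is a placeholder.
interpreter = tf.lite.Interpreter(model_path="tacotron.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching whatever shape/dtype the converted model expects.
dummy_input = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```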


Dmole commented Mar 24, 2021

There are currently 3 FOSS TTS engines for mobile: https://search.f-droid.org/?q=tts+engine&lang=en
A coqui-ai.apk could improve that short list.

@erogol erogol closed this as completed Mar 30, 2021
@coqui-ai coqui-ai locked and limited conversation to collaborators Mar 30, 2021

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
