Is your feature request related to a problem? Please describe.
I'm trying to get streaming TTS working with qwen3-tts, which runs on the rocm-qwen-tts backend. However, voxcpm is currently the only backend that supports streaming TTS, so I can't use streaming even though the model/backend is capable of it.
Describe the solution you'd like
Enable streaming TTS for every backend that is capable of it, not just voxcpm.
Describe alternatives you've considered
None
Additional context
https://github.com/vllm-project/vllm-omni/tree/main/examples/offline_inference/qwen3_tts#streaming-mode
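For clarity on what "streaming" means here: the server yields audio chunks as synthesis progresses instead of returning one complete file, so playback can start on the first chunk. A minimal sketch of that consumer pattern, using placeholder functions (`synthesize_streaming` and the text-as-bytes chunks are purely illustrative, not the actual LocalAI or vllm-omni API):

```python
from typing import Iterator

def synthesize_streaming(text: str, chunk_chars: int = 16) -> Iterator[bytes]:
    # Hypothetical stand-in for a streaming TTS backend: it yields
    # chunks incrementally as synthesis progresses. A real backend
    # would yield encoded audio frames, not text bytes.
    for i in range(0, len(text), chunk_chars):
        yield text[i:i + chunk_chars].encode("utf-8")

def consume_stream(chunks: Iterator[bytes]) -> bytes:
    # A streaming client can hand each chunk to an audio sink as it
    # arrives instead of waiting for the full file; here we simply
    # accumulate the bytes to show the incremental flow.
    audio = b""
    for chunk in chunks:
        audio += chunk
    return audio

audio = consume_stream(synthesize_streaming("hello streaming tts"))
print(len(audio))
```

The point of the request is that this chunk-by-chunk path should be wired up for any backend whose underlying engine already supports it, as qwen3-tts does per the link above.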