Streaming TTS support for other backends #9051

@vinayakv22

Description

Is your feature request related to a problem? Please describe.
I'm trying to get streaming TTS working with qwen3-tts, which runs on the rocm-qwen-tts backend. However, voxcpm is currently the only backend that supports streaming TTS, so I can't use streaming even though the model and backend are capable of it.

Describe the solution you'd like
Enable streaming TTS for every backend that is capable of it, rather than only voxcpm.

Describe alternatives you've considered
None

Additional context
https://github.com/vllm-project/vllm-omni/tree/main/examples/offline_inference/qwen3_tts#streaming-mode
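To illustrate what client-side consumption of a streamed TTS response could look like, here is a minimal sketch. The endpoint path and request fields are assumptions for illustration only (not the project's confirmed API); the helper simply assembles raw 16-bit mono PCM chunks into a WAV container as they arrive.

```python
import io
import wave

def collect_pcm_chunks(chunks, sample_rate=24000):
    """Assemble raw 16-bit mono PCM chunks into an in-memory WAV file.

    `chunks` is any iterable of bytes, e.g. an HTTP chunked response body.
    The sample rate is an assumption; a real backend would report its own.
    """
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)   # mono (assumption)
        w.setsampwidth(2)   # 16-bit samples (assumption)
        w.setframerate(sample_rate)
        for chunk in chunks:
            w.writeframes(chunk)
    return buf.getvalue()

# Hypothetical usage against a streaming endpoint (URL and JSON fields
# are placeholders, not a documented API):
#
# import requests
# resp = requests.post(
#     "http://localhost:8080/v1/audio/speech",
#     json={"model": "qwen3-tts", "input": "Hello", "stream": True},
#     stream=True,
# )
# wav_bytes = collect_pcm_chunks(resp.iter_content(chunk_size=4096))
```

The same pattern would apply to any backend once it can emit audio incrementally: the client reads chunks as they arrive instead of waiting for the full synthesis to finish.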
