Validations

- I believe this is a way to improve, and I'll try to join the Continue Discord for questions.
- I'm not able to find an open issue that requests the same enhancement.
Problem
I'm using the Continue.dev extension with Bedrock, and I've noticed that the experience can be slow, especially when generating longer responses from large language models. This is likely because the extension doesn't use streaming for Bedrock, so the entire response has to be generated before anything is returned.
Streaming can significantly improve responsiveness by letting the model return its output token by token instead of waiting for the full response. The user starts seeing output as it's generated rather than waiting for the response to complete.
I would like to request streaming support for Bedrock, particularly for Anthropic's models. This would greatly improve the user experience when working with longer prompts or generating extensive outputs.
Solution
Add streaming support for Bedrock models.
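For reference, boto3's `bedrock-runtime` client already exposes an `invoke_model_with_response_stream` operation, whose response body is an event stream of chunks. Below is a minimal sketch of how the extension could consume such a stream; the `stream_completion_text` helper and the `fake_stream` fixture are hypothetical names I'm using for illustration (the chunk payload shape assumes Anthropic's `completion` field):

```python
import json

def stream_completion_text(event_stream):
    """Yield incremental text from a Bedrock response stream.

    Each event carries a 'chunk' whose 'bytes' field is a JSON payload;
    Anthropic models put the incremental text under 'completion'.
    """
    for event in event_stream:
        chunk = event.get("chunk")
        if chunk is None:
            continue  # skip non-chunk events (e.g. metadata)
        payload = json.loads(chunk["bytes"])
        token = payload.get("completion", "")
        if token:
            yield token

# A fake stream standing in for the boto3 EventStream, so the sketch runs offline:
fake_stream = [
    {"chunk": {"bytes": json.dumps({"completion": "Hello"}).encode()}},
    {"chunk": {"bytes": json.dumps({"completion": ", world"}).encode()}},
]
print("".join(stream_completion_text(fake_stream)))  # prints "Hello, world"
```

Against the real service, the same helper would be fed `response["body"]` from `client.invoke_model_with_response_stream(modelId=..., body=...)`, and the extension could forward each yielded token to the UI as it arrives.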