
Return streaming response from PaLM 2 ChatSession #2246

Open
brendanator opened this issue May 25, 2023 · 2 comments
Labels
api: vertex-ai Issues related to the googleapis/python-aiplatform API.

Comments

@brendanator

Is your feature request related to a problem? Please describe.
I'm using vertexai.preview.language_models.ChatSession with chat-bison@001. There can be high latency when using send_message.

Describe the solution you'd like
I'd like to be able to get a streaming response so I can show the partial result to the user immediately.

Describe alternatives you've considered
None

Additional context
This functionality is available in the OpenAI and Anthropic APIs.
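For illustration, a minimal sketch of the kind of streaming interface being requested. The `send_message_streaming` method name is an assumption here (it was not part of the SDK when this issue was opened); only the chunk-accumulation helper is concrete.

```python
# Sketch of the requested streaming behaviour. The ChatSession usage and the
# send_message_streaming() method below are hypothetical; the helper simply
# accumulates streamed chunks so a UI can re-render the growing response.

from typing import Iterable, Iterator


def accumulate_stream(chunks: Iterable[str]) -> Iterator[str]:
    """Yield the partial result after each streamed chunk arrives."""
    partial = ""
    for chunk in chunks:
        partial += chunk
        yield partial


# Hypothetical usage with a streaming ChatSession:
#
#   chat = chat_model.start_chat()
#   stream = (r.text for r in chat.send_message_streaming("Hello"))
#   for partial in accumulate_stream(stream):
#       render(partial)  # show the partial result to the user immediately
```

This is what both the OpenAI and Anthropic clients already allow: iterating over response chunks as they arrive instead of blocking until the full completion is ready.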

@product-auto-label product-auto-label bot added the api: vertex-ai Issues related to the googleapis/python-aiplatform API. label May 25, 2023
@HassanOuda

Would also like this feature for chat and text bison!

@GargPriyanshu1112

Is there a way to stream responses from a PaLM 2 ChatSession?


3 participants