
OAI Streaming support #36

Closed
GithubPrankster opened this issue Apr 7, 2023 · 2 comments

GithubPrankster commented Apr 7, 2023

Is your feature request related to a problem? Please describe.
I liked the streaming implemented in TAI-Turbo, since I could see the reply coming in as it was generated, which felt natural and intuitive.

Describe the solution you'd like
Snooping around the code suggests you were probably trying to get it working in your fork at some point; the solution would be to finish that. (I wanted to look into it to maybe make a PR, but I know jack about JS. A rough sketch of what the streaming side involves is included below.)

Describe alternatives you've considered
I still tend to use TAI-Turbo just because I like that feature.

Additional context
(I was curious about which OAI model was being used, but it seems you can change it just fine now.)
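
For reference, a minimal, hypothetical sketch of what the streaming side involves: OpenAI's chat completions endpoint accepts stream: true and then returns the reply as server-sent events (data: lines, terminated by data: [DONE]), so the client reads the response body incrementally and appends each delta to the message. The helper name streamChatCompletion and the Node 18+ fetch setup below are assumptions for illustration, not this repo's actual code.

```js
// Hypothetical sketch: stream an OpenAI chat completion and emit tokens as they arrive.
// Assumes Node 18+ (built-in fetch) and an OPENAI_API_KEY environment variable.
async function streamChatCompletion(messages, onToken) {
    const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
            'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
            'Content-Type': 'application/json',
        },
        body: JSON.stringify({
            model: 'gpt-3.5-turbo',
            messages,
            stream: true, // ask the API to send the reply as SSE chunks
        }),
    });

    const decoder = new TextDecoder();
    let buffer = '';

    // Node's web ReadableStream is async-iterable; chunks are Uint8Arrays.
    for await (const chunk of response.body) {
        buffer += decoder.decode(chunk, { stream: true });

        const lines = buffer.split('\n');
        buffer = lines.pop(); // keep any partial line for the next chunk

        for (const line of lines) {
            const data = line.replace(/^data: /, '').trim();
            if (!data || data === '[DONE]') continue;
            const delta = JSON.parse(data).choices[0].delta;
            if (delta.content) onToken(delta.content); // partial text, in order
        }
    }
}

// Usage: print the reply token-by-token as it streams in.
streamChatCompletion(
    [{ role: 'user', content: 'Hello!' }],
    (token) => process.stdout.write(token),
);
```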

@Cohee1207 (Member)

Streaming will be considered once all technical blockers are resolved (swipes and conflicts with other generation backends).
AFAIK gpt-3.5-turbo and gpt-3.5-turbo-0301 are equivalent right now.

@GithubPrankster (Author)

Saw that it's implemented in the dev branch now and working very well. Thank you!
