
feat: Convert completion chunk API response to OpenAI-compatible event stream #1094

Merged · 14 commits into main · Dec 28, 2023

Conversation

boxbeam
Contributor

@boxbeam boxbeam commented Dec 21, 2023

Closes #1076

Review threads (outdated, resolved):
- crates/tabby/src/routes/chat.rs (3 threads)
- crates/tabby/src/services/chat.rs (2 threads)
Review comment on the chunk construction:

    id,
    created,
    object: "chat.completion.chunk",
    model: "TabbyML",

Member — Suggested change:

    - model: "TabbyML",
    + model: "unused-model",
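As background (not part of this PR's diff): OpenAI's streaming chat API delivers each `chat.completion.chunk` as a server-sent event, one `data: {json}` frame per chunk, terminated by a blank line. A minimal, dependency-free Rust sketch of that framing — the helper name `chunk_to_sse` and all field values are illustrative, and real code would build the JSON with a serializer such as serde rather than `format!`:

```rust
// Hypothetical sketch (not Tabby's actual implementation): format an
// OpenAI-compatible chat.completion.chunk as a server-sent event frame.
// Field names follow the OpenAI streaming schema.
fn chunk_to_sse(id: &str, created: u64, content: &str) -> String {
    // Build the JSON payload by hand to keep the sketch dependency-free.
    let json = format!(
        r#"{{"id":"{}","object":"chat.completion.chunk","created":{},"model":"unused-model","choices":[{{"index":0,"delta":{{"content":"{}"}},"finish_reason":null}}]}}"#,
        id, created, content
    );
    // SSE frames are "data: <payload>" followed by a blank line; the stream
    // conventionally ends with a final "data: [DONE]" frame.
    format!("data: {}\n\n", json)
}

fn main() {
    let event = chunk_to_sse("chatcmpl-123", 1703721600, "Hello");
    print!("{}", event);
}
```

The `model: "unused-model"` value mirrors the suggestion above: clients parsing the OpenAI schema require the field to be present, even when the server has only one model and the value carries no information.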

@wsxiaoys
Member

Please also fix all CI breakages, e.g. the Semantic PR check.

@boxbeam boxbeam changed the title Convert completion chunk API response to OpenAI-compatible event stream feat: Convert completion chunk API response to OpenAI-compatible event stream Dec 27, 2023
@wsxiaoys wsxiaoys enabled auto-merge (squash) December 27, 2023 23:49
Review threads (outdated, resolved):
- crates/tabby/src/services/chat.rs (2 threads)
@wsxiaoys wsxiaoys merged commit b0877af into main Dec 28, 2023
3 checks passed
@wsxiaoys wsxiaoys deleted the completion-chunk-event-stream branch December 28, 2023 00:13
Development

Successfully merging this pull request may close these issues.

Making /v1beta/chat/completions streaming output compatible with openai
2 participants