bugfix: convert several providers stop_reason to valid openai finish_reason #363
Conversation
Force-pushed 1e5a6ee to ae228b9
Force-pushed 351fff8 to ff93296
Force-pushed ff93296 to b5a650e
Hello @vrushankportkey @VisargD, I'm not sure if there is a process after opening a PR to request a review. Any chance this PR could be considered?
Thank you so much @unsync! Yes, we're checking the PR and will share a review, if any, in some time! Really appreciate you pushing this bugfix 🙌
Force-pushed b5a650e to 05eff29
@@ -345,6 +345,22 @@ export const AnthropicErrorResponseTransform: (
   return undefined;
 };

+// this converts the anthropic stop_reason to an openai finish_reason
+const getFinishReason = (stopReason?: string): string | null => {
Recommend creating a type
type AnthropicStopReason = 'max_tokens' | 'stop_sequence' | 'end_turn' | ....
Also recommend using a switch/case.
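A minimal sketch of what these two suggestions could look like together, with the default handled inside the function. The stop-reason values and the exact mapping below are illustrative assumptions based on the PR description, not the code that was eventually merged:

type AnthropicStopReason =
  | 'end_turn'
  | 'max_tokens'
  | 'stop_sequence'
  | 'tool_use';

// Sketch only: map an Anthropic stop_reason onto an OpenAI finish_reason,
// falling back to 'stop' for unknown or missing values.
const getFinishReason = (stopReason?: AnthropicStopReason): string => {
  switch (stopReason) {
    case 'max_tokens':
      return 'length';
    case 'tool_use':
      return 'tool_calls';
    case 'end_turn':
    case 'stop_sequence':
    default:
      return 'stop';
  }
};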
@@ -394,7 +410,8 @@ export const AnthropicChatCompleteResponseTransform: (
       },
       index: 0,
       logprobs: null,
-      finish_reason: response.stop_reason,
+      finish_reason:
+        getFinishReason(response.stop_reason) || response.stop_reason,
This should just be getFinishReason(response.stop_reason); otherwise it defeats the purpose of strictly typing the response.
Thanks for the Pull Request! Requested a few changes
@@ -226,6 +226,28 @@ interface GoogleGenerateContentResponse {
   };
 }

+// this converts the google stop_reason to an openai finish_reason
+// https://ai.google.dev/api/python/google/ai/generativelanguage/Candidate/FinishReason
+const getFinishReason = (stopReason?: string): string | null => {
Same comments as in the getFinishReason transformer function above.
- Use switch/case
- Define types for GoogleStopReason
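As above, a rough sketch for the Google transformer; the finish-reason values follow the linked Candidate.FinishReason documentation, while the mapping onto OpenAI values is an illustrative assumption:

type GoogleFinishReason =
  | 'FINISH_REASON_UNSPECIFIED'
  | 'STOP'
  | 'MAX_TOKENS'
  | 'SAFETY'
  | 'RECITATION'
  | 'OTHER';

// Sketch only: map a Google finishReason onto an OpenAI finish_reason,
// with the default case handled inside the function.
const getFinishReason = (finishReason?: GoogleFinishReason): string => {
  switch (finishReason) {
    case 'MAX_TOKENS':
      return 'length';
    case 'SAFETY':
    case 'RECITATION':
      return 'content_filter';
    case 'STOP':
    case 'OTHER':
    default:
      return 'stop';
  }
};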
@@ -291,7 +313,9 @@ export const GoogleChatCompleteResponseTransform: (
       return {
         message: message,
         index: generation.index,
-        finish_reason: generation.finishReason,
+        finish_reason:
+          getFinishReason(generation.finishReason) ??
This should only be finish_reason: getFinishReason(generation.finishReason). The default case should be handled inside getFinishReason.
@narengogi that is great!
Thanks @unsync & @narengogi!
Description:
Currently, we forward the Anthropic stop_reason, which is not compatible with the OpenAI finish_reason. This can cause warnings from some OpenAI SDKs.
The conversion function returns stop as a default value in case new stop_reason values are added. There was also a bug in the finish_reason handling of the groq and google providers.
Side note:
It would probably be beneficial to update the types so we can get compile-time validation:
gateway/src/providers/types.ts
Lines 101 to 123 in 77f32e5
Proposed change:
see https://platform.openai.com/docs/guides/text-generation/chat-completions-response-format
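A hedged sketch of the kind of change being proposed here: narrowing finish_reason in the shared response types to the values listed in the linked OpenAI docs, so an unmapped provider value fails at compile time. The type and field names below are illustrative and do not reflect the actual contents of types.ts:

type OpenAIFinishReason =
  | 'stop'
  | 'length'
  | 'tool_calls'
  | 'content_filter'
  | 'function_call';

// Sketch only: a choice shape with a strictly typed finish_reason.
interface ChatChoice {
  index: number;
  message: { role: string; content: string | null };
  logprobs: Record<string, unknown> | null;
  finish_reason: OpenAIFinishReason;
}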