
feat(ai-proxy): google-gemini support #12948

Open: wants to merge 9 commits into master
Conversation

@tysoekong (Contributor) commented Apr 27, 2024

Summary

Adds Google Gemini support to Kong AI Gateway.

Checklist

  • The Pull Request has tests
  • A changelog file has been created under changelog/unreleased/kong, or the skip-changelog label has been added to the PR if a changelog is unnecessary
  • There is a user-facing docs PR against https://github.com/Kong/docs.konghq.com - PUT DOCS PR HERE

Issue reference

AG-27

@tysoekong tysoekong marked this pull request as ready for review May 29, 2024 11:18
@tysoekong tysoekong requested review from locao, hanshuebner and jschmid1 and removed request for jschmid1 May 29, 2024 11:19
@tysoekong tysoekong changed the title DRAFT: feat(ai-proxy): google-gemini support feat(ai-proxy): google-gemini support May 29, 2024

@EduardoEspinozaPerez left a comment


LGTM

@@ -0,0 +1,9 @@
FROM kong:3.6.1
Contributor commented:

please remove this file

end

-local events = ai_shared.frame_to_events(chunk)
+local events = ai_shared.frame_to_events(chunk, conf.model.provider == "gemini")
Contributor commented:

Suggested change
-local events = ai_shared.frame_to_events(chunk, conf.model.provider == "gemini")
+local is_raw_json = conf.model.provider == "gemini"
+local events = ai_shared.frame_to_events(chunk, is_raw_json)


-- some new LLMs return the JSON object-by-object,
-- because that totally makes sense to parse?!
if raw_json_mode then
Contributor commented:

we need some unit tests for this if
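To illustrate what such a test could cover: the `raw_json_mode` branch handles providers like Gemini that stream JSON objects back-to-back rather than in SSE frames. A minimal sketch, in Python rather than the plugin's Lua and with entirely hypothetical names (this is not Kong's actual `frame_to_events` implementation or test harness), of splitting and asserting on such a frame:

```python
import json

def frame_to_events_raw_json(frame: str):
    """Split a frame containing back-to-back JSON objects into a list of
    parsed events, mimicking what a raw-JSON streaming mode must handle.
    Illustrative only; not Kong's actual API."""
    decoder = json.JSONDecoder()
    events = []
    idx = 0
    n = len(frame)
    while idx < n:
        # Skip whitespace/newlines separating the concatenated objects.
        while idx < n and frame[idx].isspace():
            idx += 1
        if idx >= n:
            break
        # raw_decode parses one object and reports where it ended.
        obj, end = decoder.raw_decode(frame, idx)
        events.append(obj)
        idx = end
    return events

# The kind of unit test the reviewer is asking for: objects may be
# newline-separated or directly adjacent within one frame.
frame = '{"candidates": [1]}\n{"candidates": [2]}{"candidates": [3]}'
events = frame_to_events_raw_json(frame)
assert len(events) == 3
assert events[1] == {"candidates": [2]}
```

An equivalent Lua spec would feed the same concatenated-object frame into `ai_shared.frame_to_events(chunk, true)` and assert on the event count and payloads.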
