feat: Add Google Gemini API support to Lambda proxy #2
Description
This pull request extends the existing Lambda proxy to support the Google Gemini API alongside the current OpenAI functionality. The implementation adheres to the provided gold standard for Gemini API interaction and ensures that existing OpenAI features remain unaffected.
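The dispatch between the two back ends hinges on the shape of the settings object: OpenAI servers carry a `url`, Gemini entries do not. A minimal sketch of that idea, with illustrative field names rather than the PR's exact interfaces:

```typescript
// Sketch only: field names are assumptions, not the PR's exact interfaces.
interface OpenAiSettings {
  url: string;       // OpenAI servers are configured with an explicit endpoint URL
  apiToken: string;
  model: string;
}

interface GeminiSettings {
  apiToken: string;  // the Gemini client library derives its own endpoint,
  model: string;     // so no `url` field is configured
}

// OpenAI configurations are recognized by the presence of `url`.
function isOpenAiSettings(s: OpenAiSettings | GeminiSettings): s is OpenAiSettings {
  return "url" in s;
}

const gemini: GeminiSettings = { apiToken: "secret", model: "gemini-pro" };
console.log(isOpenAiSettings(gemini)); // false
```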
Changes Implemented
- `@google/generative-ai`: Added the necessary library for Gemini API communication.
- `openai_servers.yaml`: Added a `gemini` configuration entry, including the API token and model specification, while omitting the URL as per client library behavior.
- `src/gemini_settings.ts`: Introduced a `GeminiSettings` interface for type-safe Gemini configurations.
- `src/llm_client.ts`
: Now accepts `OpenAiSettings | GeminiSettings` and distinguishes the two back ends (by `url` presence for OpenAI). Implemented `chatCompletionStreaming` for Gemini, precisely mimicking the provided gold standard example (using `getGenerativeModel`, `startChat`, and `sendMessageStream`).
- `src/llm_proxy.ts`
: Updated the `transformGenerator` and `formatChunk` functions to handle stream data from both OpenAI and Gemini, and `getLlmClient` to correctly instantiate `LlmClient` with either `OpenAiSettings` or `GeminiSettings`. The server settings type is now `OpenAiServerSettings | Record<string, GeminiSettings>`.
- `src/app_settings.ts`
: Replaced `getOpenAiServerSettings` with `getAllServerSettings`, which correctly parses and differentiates between OpenAI and Gemini configurations from `openai_servers.yaml` based on the presence of the `url` key.
- `src/index.ts`
: Updated to use the `getAllServerSettings` function.
- `src/tests/index.test.ts`: Updated tests for `getAllServerSettings`.
- `package.json`
: Added `@google/generative-ai` and `web-streams-polyfill` to dependencies, and updated the `build` script to mark `@google/genai` as an external dependency for `esbuild`.
- `package-lock.json`: Reflects the new dependencies.

How to Test
1. Ensure `openai_servers.yaml` has a valid `gemini` entry with an API key and model.
2. Send a chat completion request to the `/gemini/v1/chat/completions` endpoint and verify the streamed response.
3. Run `npm test` to ensure all unit and integration tests pass.

This implementation fulfills the objective of integrating Gemini support while maintaining existing functionality.
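For step 1, a `gemini` entry could look like the following. The key names here are an assumption for illustration and must match the `GeminiSettings` interface; note the deliberate absence of a `url` key:

```yaml
# Hypothetical entry in openai_servers.yaml; key names are illustrative.
gemini:
  apiToken: "YOUR_GEMINI_API_KEY"   # Gemini API key
  model: "gemini-pro"               # model to serve
  # No `url` key: the client library resolves the endpoint itself, and the
  # absence of `url` is how getAllServerSettings detects a Gemini entry.
```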