
Conversation

@shivammittal274
Contributor

No description provided.

@greptile-apps
Contributor

greptile-apps bot commented Nov 25, 2025

Greptile Overview

Greptile Summary

This PR adds a comprehensive Vercel AI SDK adapter that enables BrowserOS to use multiple LLM providers (OpenAI, Anthropic, Google, Azure, Bedrock, OpenRouter, Ollama, LMStudio) through a unified interface. The implementation uses a clean strategy pattern for converting between Gemini and Vercel AI SDK formats.

Key additions:

  • VercelAIContentGenerator class implementing the ContentGenerator interface
  • Three conversion strategies: Tool, Message, and Response conversions (a rough sketch of their shape follows this list)
  • Comprehensive test coverage (1,838 lines across 3 test files)
  • Type-safe conversions with Zod validation schemas
  • Streaming support with SSE integration for real-time responses
  • Support for text, images, and tool interactions
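A minimal shape sketch of those strategies, inferred from the summary and the sequence diagram further down. Method names are taken from the diagram, the Gemini types are assumed to come from `@google/genai`, and the actual signatures in the PR may differ:

```ts
// Hypothetical sketch only — inferred from the summary and sequence diagram,
// not copied from the PR's strategies/ directory.
import type { Content, Tool, FunctionCall, GenerateContentResponse } from '@google/genai';

// Stand-ins for the Vercel AI SDK message/tool types the strategies target.
type VercelMessage = unknown;
type VercelTool = unknown;

interface MessageConversionStrategy {
  geminiToVercel(contents: Content[]): VercelMessage[];
}

interface ToolConversionStrategy {
  geminiToVercel(tools: Tool[]): VercelTool[]; // applies normalizeForOpenAI() per the review note
  vercelToGemini(toolCalls: unknown[]): FunctionCall[];
}

interface ResponseConversionStrategy {
  // Turns the provider's stream chunks back into Gemini-style responses.
  vercelToGemini(chunks: AsyncIterable<unknown>): AsyncGenerator<GenerateContentResponse>;
}
```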

Issues found:

  • Tool schema normalization forces all parameters to be required, which will break tools with optional parameters (lines 40-46 in tool.ts)

Confidence Score: 4/5

  • Safe to merge with one logic issue that should be fixed to prevent tool execution failures
  • The code is well-architected with comprehensive test coverage and follows best practices. However, the OpenAI schema normalization makes all tool parameters required, which will cause runtime failures when tools have optional parameters. This is a critical issue for tool execution but doesn't affect the core adapter functionality.
  • Pay attention to packages/agent/src/agent/gemini-vercel-sdk-adapter/strategies/tool.ts - the normalizeForOpenAI method needs to be fixed before tools with optional parameters are used

Important Files Changed

File Analysis

| Filename | Score | Overview |
| --- | --- | --- |
| packages/agent/src/agent/gemini-vercel-sdk-adapter/index.ts | 5/5 | Adds main adapter class implementing ContentGenerator interface with multi-provider support (Anthropic, OpenAI, Google, etc.). Clean architecture with strategy pattern for conversions. |
| packages/agent/src/agent/gemini-vercel-sdk-adapter/strategies/tool.ts | 4/5 | Tool conversion with OpenAI strict mode normalization. Makes all properties required which may cause issues with optional parameters. |
| packages/agent/src/agent/gemini-vercel-sdk-adapter/strategies/message.ts | 5/5 | Message conversion handling text, images, and tool interactions. Properly handles duplicate tool results and edge cases. |
| packages/agent/src/agent/gemini-vercel-sdk-adapter/strategies/response.ts | 5/5 | Response conversion for both streaming and non-streaming. Includes SSE support and proper error handling for stream chunks. |
| packages/agent/package.json | 5/5 | Adds required dependencies for Vercel AI SDK v5 and multiple provider SDKs (Anthropic, OpenAI, Google, Azure, Bedrock). |

Sequence Diagram

```mermaid
sequenceDiagram
    participant Client as Client/CLI
    participant Adapter as VercelAIContentGenerator
    participant MsgStrategy as MessageConversionStrategy
    participant ToolStrategy as ToolConversionStrategy
    participant RespStrategy as ResponseConversionStrategy
    participant Provider as AI Provider (OpenAI/Anthropic/etc)
    participant Stream as HonoSSEStream

    Client->>Adapter: generateContentStream(request)
    Adapter->>MsgStrategy: geminiToVercel(contents)
    MsgStrategy-->>Adapter: CoreMessage[]
    Adapter->>ToolStrategy: geminiToVercel(tools)
    ToolStrategy->>ToolStrategy: normalizeForOpenAI(schema)
    ToolStrategy-->>Adapter: VercelTool[]
    Adapter->>Provider: streamText({messages, tools, system})
    Provider-->>Adapter: fullStream (AsyncIterable)
    
    loop For each stream chunk
        Provider->>RespStrategy: chunk (text-delta/tool-call/finish)
        alt text-delta chunk
            RespStrategy->>Stream: write SSE (text-delta)
            RespStrategy-->>Adapter: yield GenerateContentResponse
        else tool-call chunk
            RespStrategy->>Stream: write SSE (tool-call)
            RespStrategy->>RespStrategy: accumulate in toolCallsMap
        else finish chunk
            RespStrategy->>RespStrategy: store finishReason
        end
    end
    
    RespStrategy->>Provider: getUsage()
    Provider-->>RespStrategy: usage metadata
    RespStrategy->>Stream: write SSE (finish event)
    RespStrategy->>ToolStrategy: vercelToGemini(toolCalls)
    ToolStrategy-->>RespStrategy: FunctionCall[]
    RespStrategy-->>Adapter: yield final GenerateContentResponse
    Adapter-->>Client: AsyncGenerator<GenerateContentResponse>
```
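To make the diagram concrete, here is a hedged usage sketch. Only `VercelAIContentGenerator`, `generateContentStream`, and the Gemini-style request/response types are taken from the summary and diagram; the constructor options and import path are assumptions.

```ts
// Hypothetical usage — the real config shape lives in
// packages/agent/src/agent/gemini-vercel-sdk-adapter/index.ts and may differ.
import { VercelAIContentGenerator } from './gemini-vercel-sdk-adapter'; // assumed path

async function demo() {
  const generator = new VercelAIContentGenerator({
    provider: 'openai',                 // assumed option name
    model: 'gpt-4o',                    // placeholder model id
    apiKey: process.env.OPENAI_API_KEY, // assumed option name
  });

  // Callers keep using the Gemini ContentGenerator interface; the adapter
  // translates to/from the Vercel AI SDK via the three conversion strategies.
  const stream = await generator.generateContentStream({
    model: 'gpt-4o',
    contents: [{ role: 'user', parts: [{ text: 'Summarize this page.' }] }],
  });

  for await (const chunk of stream) {
    // Each chunk is a Gemini-style GenerateContentResponse assembled by
    // ResponseConversionStrategy from the provider's stream events.
    console.log(chunk);
  }
}
```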


greptile-apps bot left a comment


14 files reviewed, 1 comment


Comment on lines 40 to 46 of packages/agent/src/agent/gemini-vercel-sdk-adapter/strategies/tool.ts:

```ts
const propertyKeys = Object.keys(result.properties);
if (propertyKeys.length > 0) {
  // Merge with existing required array (if any) and ensure all keys are included
  const existingRequired = Array.isArray(result.required) ? result.required : [];
  const allRequired = Array.from(new Set([...existingRequired, ...propertyKeys]));
  result.required = allRequired;
}
```

**logic:** making all properties required may break tools with optional parameters

The normalization adds all property keys to the `required` array, making every parameter mandatory. Tools with genuinely optional params will fail if those params aren't provided.
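For illustration, here is the effect on a hypothetical tool schema (not from the PR) with one optional parameter:

```ts
// Hypothetical input schema for a tool with a required `url` and an optional `timeoutMs`.
const before = {
  type: 'object',
  properties: {
    url: { type: 'string' },
    timeoutMs: { type: 'number' }, // genuinely optional
  },
  required: ['url'],
};

// After the normalization on lines 40-46, every property key ends up in `required`,
// so the model is forced to supply `timeoutMs` on every call:
const after = {
  ...before,
  required: ['url', 'timeoutMs'],
};
```

Note that OpenAI's strict structured-output mode does expect every property to appear in `required`; optionality there is usually expressed by making the property's type nullable rather than by omitting it from `required`. If strict mode is the goal, that is an alternative to the merge-only approach suggested below.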

Suggested change

```diff
-      const propertyKeys = Object.keys(result.properties);
-      if (propertyKeys.length > 0) {
-        // Merge with existing required array (if any) and ensure all keys are included
-        const existingRequired = Array.isArray(result.required) ? result.required : [];
-        const allRequired = Array.from(new Set([...existingRequired, ...propertyKeys]));
-        result.required = allRequired;
-      }
+      // 2. Add required array with ALL property keys
+      if (result.properties && typeof result.properties === 'object') {
+        const propertyKeys = Object.keys(result.properties);
+        if (propertyKeys.length > 0) {
+          // Only merge with existing required array, don't force all properties to be required
+          if (Array.isArray(result.required) && result.required.length > 0) {
+            result.required = Array.from(new Set(result.required));
+          }
+        }
+      }
```

```ts
/**
 * Configuration for Vercel AI adapter
 */
export interface VercelAIConfig {
```

@greptileai what is a better way to have this structure? Can we have anthropic, openai, and google as enums, and a different structure for the config?


Looking at the current VercelAIConfig interface, I can suggest a better structure that uses enums for providers and organizes the configuration more logically:

Suggested change

```diff
-export interface VercelAIConfig {
+/**
+ * Supported AI providers
+ */
+export enum AIProvider {
+  ANTHROPIC = 'anthropic',
+  OPENAI = 'openai',
+  GOOGLE = 'google',
+  OPENROUTER = 'openrouter',
+  AZURE = 'azure',
+  OLLAMA = 'ollama',
+  LMSTUDIO = 'lmstudio',
+  AWS_BEDROCK = 'aws-bedrock',
+}
+
+/**
+ * Provider-specific configuration
+ */
+export interface ProviderConfig {
+  apiKey?: string;
+  baseUrl?: string;
+  region?: string;
+  resourceName?: string;
+  accessKeyId?: string;
+  secretAccessKey?: string;
+  sessionToken?: string;
+}
+
+/**
+ * Configuration for Vercel AI adapter
+ */
+export interface VercelAIConfig {
+  model: string;
+  provider: AIProvider;
+  providerConfig?: ProviderConfig;
+  honoStream?: HonoSSEStream;
+}
```

This structure provides several benefits:

  1. Type safety: Using enums prevents typos and provides autocomplete
  2. Cleaner organization: All provider-specific settings are grouped under providerConfig
  3. Extensibility: Easy to add new providers without cluttering the main interface
  4. Explicit provider selection: Forces users to specify which provider they're using
  5. Flexible configuration: The ProviderConfig can accommodate different providers' needs without optional fields for every possible provider

This way, instead of having separate optional fields for each provider's configuration scattered in the main interface, everything is properly organized and type-safe.
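As a quick usage sketch of the proposed shape (field values below are placeholders, and `HonoSSEStream` is assumed to be the stream type the adapter already uses):

```ts
// Hypothetical usage of the proposed config shape — values are placeholders.
const anthropicConfig: VercelAIConfig = {
  provider: AIProvider.ANTHROPIC,
  model: 'claude-sonnet-4-20250514',
  providerConfig: {
    apiKey: process.env.ANTHROPIC_API_KEY,
  },
};

// A Bedrock setup reuses the same shape with different providerConfig fields:
const bedrockConfig: VercelAIConfig = {
  provider: AIProvider.AWS_BEDROCK,
  model: 'anthropic.claude-3-5-sonnet-20240620-v1:0',
  providerConfig: {
    region: 'us-east-1',
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
};
```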


@shivammittal274 could you update the code to use something similar to the design above?

@felarof99
Contributor

LGTM!

@felarof99 merged commit 9cf99b9 into main on Nov 25, 2025