Language Implementation
Feature Type
🚀 The feature, motivation and pitch
The `create-onchain-agent` templates currently hardcode OpenAI as the LLM provider. The generated `create-agent.ts` uses `import { openai } from "@ai-sdk/openai"` and calls `openai.chat("gpt-4o-mini")` directly, and the generated `.env.local` only includes `OPENAI_API_KEY=`.
This means a developer using any other provider (Anthropic, Google, OpenRouter, Groq, Ollama, etc.) has to open `create-agent.ts` and rewrite the model initialization before they can even run the agent.
This is unnecessary: `@ai-sdk/openai` and `@langchain/openai`, the packages already installed in the template, natively support connecting to any OpenAI-compatible API via their `baseURL` option, and all major LLM providers now expose OpenAI-compatible endpoints (Anthropic, Google Gemini, OpenRouter, Groq, Together AI, Ollama).
Proposal: replace the hardcoded `openai("gpt-4o-mini")` singleton with `createOpenAI({ apiKey, baseURL })` configured via environment variables:
- `AI_API_KEY`: API key for any provider (falls back to `OPENAI_API_KEY`)
- `AI_BASE_URL`: provider's endpoint (optional; defaults to OpenAI)
- `AI_MODEL`: model identifier (optional; defaults to `gpt-4o-mini`)
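A minimal sketch of the proposed resolution logic. The helper name `resolveModelConfig` is hypothetical, not an existing API; in the actual template the result would be passed to `createOpenAI` from `@ai-sdk/openai`, which is shown in a comment here to keep the sketch self-contained:

```typescript
interface ModelConfig {
  apiKey: string;
  baseURL?: string;
  model: string;
}

// Resolve provider settings from environment variables, keeping the
// backward-compatible fallback to OPENAI_API_KEY.
function resolveModelConfig(env: Record<string, string | undefined>): ModelConfig {
  const apiKey = env.AI_API_KEY ?? env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error("Set AI_API_KEY (or OPENAI_API_KEY) in .env.local");
  }
  return {
    apiKey,
    baseURL: env.AI_BASE_URL,            // undefined → SDK default (OpenAI)
    model: env.AI_MODEL ?? "gpt-4o-mini", // same default as today
  };
}

// In create-agent.ts this would replace the hardcoded singleton, e.g.:
//   const config = resolveModelConfig(process.env);
//   const provider = createOpenAI({ apiKey: config.apiKey, baseURL: config.baseURL });
//   const model = provider.chat(config.model);
```

Because `AI_BASE_URL` and `AI_MODEL` are optional with OpenAI defaults, existing projects that only set `OPENAI_API_KEY` behave exactly as before.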
The generated `.env.local` would include inline examples for all major providers, so a developer with an Anthropic key can just set two env vars and run `npm run dev`, with no code changes needed.
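A sketch of what the generated `.env.local` section could look like. The variable names follow the proposal; the base URLs shown are the providers' documented OpenAI-compatible endpoints at the time of writing and should be verified against current provider docs:

```shell
# LLM provider configuration (any OpenAI-compatible API works).
# Leave AI_BASE_URL unset to use OpenAI directly.
AI_API_KEY=
# AI_BASE_URL=https://openrouter.ai/api/v1     # OpenRouter
# AI_BASE_URL=https://api.groq.com/openai/v1   # Groq
# AI_BASE_URL=http://localhost:11434/v1        # Ollama (local)
# AI_MODEL=gpt-4o-mini

# Legacy variable, still honored if AI_API_KEY is not set.
OPENAI_API_KEY=
```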
This change:
- Adds zero new dependencies (uses `createOpenAI`/`ChatOpenAI`, which are already installed)
- Is fully backward-compatible (falls back to `OPENAI_API_KEY` if `AI_API_KEY` is not set)
- Matches how the core `@coinbase/agentkit` library already works: fully LLM-agnostic
Alternatives
Let developers manually edit `create-agent.ts` after scaffolding. This is the current approach but creates friction for anyone not using OpenAI.
Additional context
The core `@coinbase/agentkit` library and all framework extensions (`agentkit-langchain`, `agentkit-vercel-ai-sdk`) are already fully LLM-agnostic. The OpenAI lock-in exists only in the scaffolding layer. The Python `create-onchain-agent` has the same issue.