Problem
`agentspec scan` and `agentspec generate` are hardcoded to the Anthropic SDK (`@anthropic-ai/sdk`). Users must have `ANTHROPIC_API_KEY` to use the two most impressive features. This creates provider lock-in that hurts adoption — users on OpenRouter, Groq, Ollama, or Nvidia NIM can't use core features.

This blocks both launch hooks: neither GIF can be recorded without an Anthropic key, and the launch narrative shouldn't depend on a specific paid provider.
Proposed solution
Abstract the LLM call behind an `LLMClient` interface with two backends, selected by environment variables:
| Env vars set | Backend | Providers covered |
| --- | --- | --- |
| `AGENTSPEC_LLM_API_KEY` | OpenAI SDK | OpenRouter, Groq, Together, Ollama, Nvidia NIM, any OpenAI-compatible |
| `ANTHROPIC_API_KEY` | Anthropic SDK (existing) | Anthropic, OpenRouter (Anthropic format) |
| Both set | `AGENTSPEC_LLM_*` takes precedence | — |
| Neither set | Error with multi-provider help message | — |
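The abstraction and the precedence rules above could be sketched as follows. This is a minimal illustration, not the final API — the names `resolveBackend` and `ResolvedBackend` are assumptions, and the real `LLMClient` implementations would wrap the two SDKs:

```typescript
// Illustrative sketch of the proposed LLMClient abstraction.
// Both backends expose the same call shape to scan/generate.
interface LLMClient {
  complete(prompt: string): Promise<string>;
}

// Which backend to construct, resolved purely from env vars.
type ResolvedBackend =
  | { kind: "openai-compatible"; apiKey: string; baseUrl?: string; model?: string }
  | { kind: "anthropic"; apiKey: string };

// Precedence per the table: AGENTSPEC_LLM_* wins when both are set.
function resolveBackend(env: Record<string, string | undefined>): ResolvedBackend {
  if (env.AGENTSPEC_LLM_API_KEY) {
    return {
      kind: "openai-compatible",
      apiKey: env.AGENTSPEC_LLM_API_KEY,
      baseUrl: env.AGENTSPEC_LLM_BASE_URL,
      model: env.AGENTSPEC_LLM_MODEL,
    };
  }
  if (env.ANTHROPIC_API_KEY) {
    return { kind: "anthropic", apiKey: env.ANTHROPIC_API_KEY };
  }
  // Real implementation would print the multi-provider help message below.
  throw new Error("No LLM API key configured.");
}
```

Keeping resolution as a pure function over an env record (rather than reading `process.env` inline) makes the precedence rules trivially unit-testable.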
New env vars
- `AGENTSPEC_LLM_API_KEY` — API key
- `AGENTSPEC_LLM_BASE_URL` — base URL of an OpenAI-compatible endpoint (e.g., `https://openrouter.ai/api/v1`)
- `AGENTSPEC_LLM_MODEL` — model ID (e.g., `qwen/qwen3-235b-a22b`)
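For example, pointing agentspec at OpenRouter would look like this (key value is a placeholder; the model ID is the example from above):

```shell
# Configure any OpenAI-compatible provider via the new env vars
export AGENTSPEC_LLM_API_KEY="sk-or-..."
export AGENTSPEC_LLM_BASE_URL="https://openrouter.ai/api/v1"
export AGENTSPEC_LLM_MODEL="qwen/qwen3-235b-a22b"
# then run, e.g.: agentspec generate agent.yaml --framework langgraph
```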
Files to modify
- `packages/adapter-claude/src/index.ts` — add `LLMClient` interface, OpenAI backend, env resolver
- `packages/adapter-claude/package.json` — add `openai` dependency
- `packages/adapter-claude/src/__tests__/claude-adapter.test.ts` — tests for OpenAI backend + env resolution
- `packages/cli/src/commands/generate.ts` — update error message
- `packages/cli/src/commands/scan.ts` — update error message
Error message when no key is set
```
No LLM API key configured. Set one of:
  AGENTSPEC_LLM_API_KEY + AGENTSPEC_LLM_BASE_URL (any OpenAI-compatible provider)
  ANTHROPIC_API_KEY (Anthropic Claude)

Free options: OpenRouter (openrouter.ai), Groq (console.groq.com)
```
Acceptance criteria
- `AGENTSPEC_LLM_API_KEY=... AGENTSPEC_LLM_BASE_URL=https://openrouter.ai/api/v1 agentspec generate agent.yaml --framework langgraph` works
- `ANTHROPIC_API_KEY=... agentspec generate agent.yaml --framework langgraph` still works (backward compat)