
Add OpenAI as third LLM provider (GPT-5.5, GPT-5.4 Nano)#16

Open
thegrif wants to merge 1 commit into willchen96:main from thegrif:main

Conversation


thegrif commented May 2, 2026

Adds full OpenAI support across backend and frontend, following the existing Claude/Gemini provider pattern: streaming chat with tool use, text completion, model selection UI, API key management, and DB schema.
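The description says the new provider follows the existing Claude/Gemini provider pattern. That pattern isn't shown in this thread, so the following is only a minimal sketch of what such a provider interface might look like; every name here (`LLMProvider`, `streamChat`, `echoProvider`, etc.) is hypothetical, and the toy implementation exists purely so the shape can be exercised without network calls:

```typescript
// Hypothetical provider interface mirroring a Claude/Gemini-style pattern.
// None of these names come from the repo; they only illustrate the shape of
// "streaming chat with tool use" plus "text completion" behind one interface.
interface ChatChunk {
  type: "text" | "tool_use";
  content: string;
}

interface LLMProvider {
  id: "anthropic" | "google" | "openai";
  streamChat(messages: { role: string; content: string }[]): AsyncIterable<ChatChunk>;
  complete(prompt: string): Promise<string>;
}

// Toy in-memory provider: streams each message back as a text chunk and
// "completes" by upper-casing, so the interface is runnable offline.
const echoProvider: LLMProvider = {
  id: "openai",
  async *streamChat(messages) {
    for (const m of messages) {
      yield { type: "text", content: m.content };
    }
  },
  async complete(prompt) {
    return prompt.toUpperCase();
  },
};
```

A real OpenAI-backed implementation would presumably wrap the OpenAI SDK's streaming chat-completions call behind `streamChat` and map tool-call deltas to `tool_use` chunks.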

Adds full OpenAI support across backend and frontend, following the
existing Claude/Gemini provider pattern: streaming chat with tool use,
text completion, model selection UI, API key management, and DB schema.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

jpbreda commented May 4, 2026

Try configuring with this branch: https://github.com/jpbreda/mike/tree/feature/localllm-provider-support. It should accept a public OpenAI endpoint; see env.example for configuring the URL, key, and model ID. Tracked in PR #20.
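The exact contents of that branch's env.example aren't quoted here, so the variable names below are illustrative guesses at the kind of URL/key/model-ID trio being described, not the file's actual contents:

```shell
# Hypothetical env.example entries (names are guesses, not from the branch):
# an OpenAI-compatible endpoint, its API key, and the model id to request.
OPENAI_COMPATIBLE_BASE_URL=http://localhost:8000/v1
OPENAI_COMPATIBLE_API_KEY=sk-local-placeholder
OPENAI_COMPATIBLE_MODEL_ID=meta-llama/Llama-3.1-8B-Instruct
```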

nforum pushed a commit to nforum/mike that referenced this pull request May 7, 2026
…lchen96#16 (OpenAI)

Unified LLM provider architecture:
- openai.ts: dual client factory (OpenAI cloud + vLLM local) via baseURL
- models.ts: all 4 provider groups (LocalLLM, Anthropic, Google, OpenAI)
- userSettings.ts: DB openai key with VLLM_API_KEY env fallback
- ModelToggle.tsx: 4-group type union and GROUP_ORDER
- modelAvailability.ts: LocalLLM always available (server-configured)
- All frontend apiKeys: use profile.openaiApiKey from DB
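The first bullet describes a dual client factory that targets either OpenAI's cloud API or a local vLLM server by swapping `baseURL`, with the DB-stored key falling back to `VLLM_API_KEY`. A minimal sketch of that selection logic, assuming a helper name and option shape not present in the actual openai.ts:

```typescript
// Hypothetical sketch of the dual-client idea from the commit message:
// choose the OpenAI SDK configuration from either the user's cloud key
// (stored in the DB profile) or a server-configured local vLLM endpoint.
// resolveOpenAIConfig and the option names are illustrative only.
interface OpenAIClientConfig {
  apiKey: string;
  baseURL: string;
}

function resolveOpenAIConfig(opts: {
  userApiKey?: string; // e.g. profile.openaiApiKey from the DB
  vllmBaseUrl?: string; // e.g. a server-side env var for the local endpoint
  vllmApiKey?: string; // e.g. the VLLM_API_KEY env fallback
}): OpenAIClientConfig {
  // Prefer the user's own cloud key; otherwise fall back to local vLLM.
  if (opts.userApiKey) {
    return { apiKey: opts.userApiKey, baseURL: "https://api.openai.com/v1" };
  }
  if (opts.vllmBaseUrl && opts.vllmApiKey) {
    return { apiKey: opts.vllmApiKey, baseURL: opts.vllmBaseUrl };
  }
  throw new Error("No OpenAI-compatible credentials configured");
}
```

Because vLLM exposes an OpenAI-compatible API, the same SDK client can serve both targets; only `apiKey` and `baseURL` differ, which is what makes a single factory sufficient.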
