Description

Environment:
CodePilot v0.37.0
Windows
Provider type: Custom API (OpenAI-compatible)
Base URL: http://127.0.0.1:4000
Backend proxy: LiteLLM
Model exposed by LiteLLM: nvidia-glm47
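For context, the model is exposed through LiteLLM's proxy config; a sketch of the relevant fragment is below. The `litellm_params` values here are hypothetical placeholders, not the actual config — only the `model_name` alias (`nvidia-glm47`) is real:

```yaml
model_list:
  - model_name: nvidia-glm47           # alias returned by /models
    litellm_params:
      model: openai/nvidia-glm47       # hypothetical upstream route
      api_base: http://localhost:8000  # hypothetical backend URL
```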
What I expected:
After adding the provider, I should be able to select the provider/model in the conversation header.
The docs state that adding a provider includes selecting a default model, and that providers can be switched from the conversation header.
What actually happened:
The provider is saved and appears under "Connected Providers".
LiteLLM /chat/completions works.
LiteLLM /models returns nvidia-glm47.
But the chat model dropdown only shows Claude Code models (Sonnet/Opus/Haiku).
The Custom API edit dialog does not show any "default model" selection step.
Repro steps:
1. Run LiteLLM locally on http://127.0.0.1:4000
2. Verify that /models returns nvidia-glm47
3. Add the provider in CodePilot (Custom API, Base URL http://127.0.0.1:4000)
4. Save the provider
5. Open a new chat and check the model picker
Additional verification:
Both /models and /chat/completions respond correctly against http://127.0.0.1:4000, so the proxy itself is healthy; CodePilot just never queries or displays the custom provider's models.
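For reference, LiteLLM's /models endpoint returns the standard OpenAI list format, which is what CodePilot would need to parse to populate the picker. A minimal sketch of the check (the payload is inlined here for illustration; field values other than the model id are assumptions):

```python
import json

# Example /models payload in the OpenAI-compatible list format.
# In the actual test this was fetched from http://127.0.0.1:4000/models.
payload = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "nvidia-glm47", "object": "model", "owned_by": "openai"}
  ]
}
""")

# Extract the model ids the client should show in its dropdown.
model_ids = [m["id"] for m in payload["data"]]
print(model_ids)  # ['nvidia-glm47']
```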