
[Bug] Custom API / LiteLLM provider saves successfully but models do not appear in chat model picker on Windows v0.37.0 #296

@lhylvsea

Description


Environment:

  • CodePilot v0.37.0
  • Windows
  • Provider type: Custom API (OpenAI-compatible)
  • Base URL: http://127.0.0.1:4000
  • Backend proxy: LiteLLM
  • Model exposed by LiteLLM: nvidia-glm47
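
For reference, a LiteLLM proxy config that exposes a model under an alias like this generally follows the shape below. The upstream `model` route and `api_base` here are placeholders, not copied from my actual setup:

```yaml
# config.yaml — minimal LiteLLM proxy sketch (upstream values are placeholders)
model_list:
  - model_name: nvidia-glm47        # alias the proxy exposes via /models
    litellm_params:
      model: openai/some-upstream-model   # placeholder upstream route
      api_base: http://example.invalid/v1 # placeholder upstream endpoint
```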

What I expected:

  • After adding the provider, I should be able to select the provider/model in the conversation header.
  • Docs say adding a provider includes selecting a default model, and providers can be switched in the conversation header.

What actually happened:

  • The provider is saved and appears under "Connected Providers".
  • LiteLLM /chat/completions works.
  • LiteLLM /models returns nvidia-glm47.
  • But the chat model dropdown only shows Claude Code models (Sonnet/Opus/Haiku).
  • The Custom API edit dialog does not show any "default model" selection step.
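
For completeness, the /models check above boils down to reading the `data[*].id` fields of the OpenAI-compatible list response. A small sketch (the sample payload mirrors the shape the proxy returns; the helper name is mine):

```python
import json

def model_ids(models_response: str) -> list[str]:
    """Pull model ids out of an OpenAI-compatible GET /models response."""
    return [m["id"] for m in json.loads(models_response).get("data", [])]

# Response shape LiteLLM's /models endpoint returns for the setup above
sample = '{"object": "list", "data": [{"id": "nvidia-glm47", "object": "model"}]}'
print(model_ids(sample))  # → ['nvidia-glm47']
```

So the proxy side does advertise the model; only the chat picker ignores it.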

Repro steps:

  1. Run LiteLLM locally on http://127.0.0.1:4000
  2. Verify /models returns nvidia-glm47
  3. Add a provider in CodePilot (type: Custom API / OpenAI-compatible, base URL http://127.0.0.1:4000)
  4. Save provider
  5. Open a new chat and check the model picker

Additional verification:
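
The /chat/completions check used a request of roughly this shape (minimal OpenAI-compatible body; the send itself is commented out so the sketch stands alone):

```python
import json
from urllib import request

# Minimal OpenAI-compatible /chat/completions body (model alias from the LiteLLM setup above)
payload = {
    "model": "nvidia-glm47",
    "messages": [{"role": "user", "content": "hello"}],
}

req = request.Request(
    "http://127.0.0.1:4000/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to send against a running proxy; it returns a normal completion:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```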
