Describe the bug
I tried to set up copilot-cli to use my z.ai coding plan. Here is my config (in .zshrc):
export COPILOT_PROVIDER_BASE_URL=https://api.z.ai/api/coding/paas/v4
export COPILOT_PROVIDER_API_KEY=MY-Z.AI-API-KEY
export COPILOT_MODEL=GLM-5.1
and got this error:
Model "gem-5.1" is not in the built-in catalog. Using defaults for: prompt tokens (COPILOT_PROVIDER_MAX_PROMPT_TOKENS), output tokens
(COPILOT_PROVIDER_MAX_OUTPUT_TOKENS). Run copilot help providers for configuration details.
Then I tried local Ollama:
export COPILOT_PROVIDER_BASE_URL=http://localhost:11434
export COPILOT_MODEL=gemma3:4b
and got the same message:
Model "gemma3:4b" is not in the built-in catalog. Using defaults for: prompt tokens (COPILOT_PROVIDER_MAX_PROMPT_TOKENS), output tokens
(COPILOT_PROVIDER_MAX_OUTPUT_TOKENS). Run copilot help providers for configuration details.
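For what it's worth, the warning itself names two variables for overriding the token defaults. A sketch of what I would expect to work, assuming the CLI honors them as the message suggests (the limit values below are placeholders, not values I've verified):

```shell
# Variable names taken from the CLI's own warning text;
# the numeric limits here are hypothetical examples only
export COPILOT_PROVIDER_MAX_PROMPT_TOKENS=128000
export COPILOT_PROVIDER_MAX_OUTPUT_TOKENS=8192
```

Setting these did not change the outcome for me; the CLI still refused to proceed with either provider.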
Copilot CLI refuses to run with either of these configs.
Affected version
GitHub Copilot CLI 1.0.23.
Steps to reproduce the behavior
described above.
Expected behavior
to be able to use BYOK models
Additional context
No response