Description
Check for existing issues
- Completed
Describe the bug / provide steps to reproduce it
Summary: Overriding the `default_model` does not take effect when using the `openai` provider - Zed continues to attempt to set the model to `gpt-3.5-turbo`.
Repro:

- Override the `default_model` for the OpenAI provider
- Override the `api_url`
- Set an API key
- Observe that requests fail when using the Assistant panel
- Debug the request logs and observe that the model being passed is the default OpenAI `gpt-3.5-turbo` model instead of the `@cf/meta/llama-3.2-3b-instruct` model set in `assistant.default_model.model` in `settings.json`.
Note: I work at Cloudflare, so I was able to see the request our API infrastructure received (and rejected due to the model mismatch).
I can see that the model should be passed per https://github.com/zed-industries/zed/blob/main/crates/open_ai/src/open_ai.rs#L159 and https://github.com/zed-industries/zed/blob/main/crates/open_ai/src/open_ai.rs#L306-L312, but I can't see where the settings override is getting reset/ignored.
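To make the mismatch concrete, here is a minimal sketch of the shape of an OpenAI-compatible chat completion request body (illustrative types only, not Zed's actual `open_ai` crate code); the `model` field is where the configured value should end up:

```rust
// Illustrative only: shows that the model name travels in the JSON body,
// so the override should appear as "model": "@cf/meta/llama-3.2-3b-instruct".
// Requires the `serde` (with "derive") and `serde_json` crates.
use serde::Serialize;

#[derive(Serialize)]
struct ChatCompletionRequest {
    model: String, // expected: the value from assistant.default_model.model
    messages: Vec<Message>,
    stream: bool,
}

#[derive(Serialize)]
struct Message {
    role: String,
    content: String,
}

fn main() {
    let request = ChatCompletionRequest {
        model: "@cf/meta/llama-3.2-3b-instruct".into(),
        messages: vec![Message {
            role: "user".into(),
            content: "hello".into(),
        }],
        stream: false,
    };
    // Prints the body that should be sent; the bug is that `model` is
    // still "gpt-3.5-turbo" in the requests Zed actually makes.
    println!("{}", serde_json::to_string_pretty(&request).unwrap());
}
```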
Relevant `settings.json`:

```json
"assistant": {
  "dock": "right",
  "enabled": true,
  "version": "2",
  "default_model": {
    "provider": "openai",
    "model": "@cf/meta/llama-3.2-3b-instruct"
  }
},
"language_models": {
  "openai": {
    "api_url": "https://api.cloudflare.com/client/v4/accounts/d458dbe698b8eef41837f941d73bc5b3/ai/v1"
  }
}
```
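One way to confirm the failure is on Zed's side rather than the endpoint is to send the overridden model to the configured `api_url` manually. A minimal sketch, assuming a `CLOUDFLARE_API_TOKEN` environment variable and the `reqwest` (with the "json" feature), `tokio`, and `serde_json` crates; the `/chat/completions` path follows the OpenAI-compatible convention:

```rust
// Hedged verification sketch, not part of Zed: POST a chat completion with
// the overridden model to the same api_url configured in settings.json.
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_url = "https://api.cloudflare.com/client/v4/accounts/d458dbe698b8eef41837f941d73bc5b3/ai/v1";
    let token = std::env::var("CLOUDFLARE_API_TOKEN")?;

    let body = json!({
        "model": "@cf/meta/llama-3.2-3b-instruct",
        "messages": [{ "role": "user", "content": "Say hello" }]
    });

    let response = reqwest::Client::new()
        .post(format!("{api_url}/chat/completions"))
        .bearer_auth(token)
        .json(&body)
        .send()
        .await?;

    // A success status here indicates the endpoint and model are fine, and
    // the failure is Zed sending "gpt-3.5-turbo" instead of the configured model.
    println!("{}", response.status());
    println!("{}", response.text().await?);
    Ok(())
}
```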
Environment
Zed: v0.156.2 (Zed)
OS: macOS 14.7.0
Memory: 32 GiB
Architecture: aarch64
If applicable, add mockups / screenshots to help explain or present your vision of the feature
Assistant error: (screenshot)

Model selection resets / never includes the default model I set: (screenshot)