Before submitting your bug report
Relevant environment info
- OS: Linux
- Continue version: 1.5.45
- IDE version: n/a; CLI
- Model: local
- config:
  name: Local Config
  version: 1.0.0
  schema: v1
  models:
    - name: local/large
      provider: openai
      model: local/large
      apiKey: key
      apiBase: https://myserver.lan:port/v1
Description
Neither local/large nor local-large works as a model name in the CLI config.yaml: the TUI reports the model name as "large", and the LLM never responds to chat input. When I change the backend to recognise "large" as an alias and set model to just large, it suddenly works.
Conclusion: the model field is essentially being basenamed on multiple delimiters, such as / and -, instead of being respected verbatim. I think only the ai-sdk provider should have special handling of the model name.
To reproduce
No response
Log output
No response