
Conversation


@intellectronica commented Apr 4, 2024

Adds an option -o base_url https://... to override the endpoint used for Command-R.

This is useful when using the model through a cloud provider or your own deployment. I tested this with a deployment on Azure: after deploying Command-R+ through Azure AI Studio, I got an endpoint of the form https://....swedencentral.inference.ai.azure.com/v1.
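For illustration, an invocation using the proposed option might look like this; the model ID and the endpoint URL here are placeholders, not values confirmed in this PR:

```sh
# Hypothetical usage of the proposed -o base_url option.
# The model ID (command-r-plus) and endpoint URL are illustrative.
llm -m command-r-plus \
  -o base_url https://my-deployment.swedencentral.inference.ai.azure.com/v1 \
  "Say hello"
```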

Ultimately I think this is something llm could manage in a standard way, like keys, since most popular models are now available from various cloud providers or as your own deployment (when the weights are available). If you think this is interesting, let me know and I'd be happy to contribute it.
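For comparison, this is how llm already manages API keys centrally; the key alias "cohere" is an assumption about this plugin, not something stated in this thread:

```sh
# llm's existing centralized key management, which a standard
# base-URL mechanism could mirror. The alias "cohere" is assumed.
llm keys set cohere
```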

@simonw (Owner) commented Mar 28, 2025

This one feels a bit weird to me as a model option, mainly because none of the other LLM plugins have used options in this way.

I'm going to implement this as an environment variable instead.
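A sketch of what the environment-variable approach could look like from the shell; the variable name LLM_COMMAND_R_BASE_URL is an assumption, since this thread doesn't show the name used in the actual commit:

```sh
# Hypothetical environment-variable override; the variable name and
# endpoint URL are illustrative, not confirmed by this conversation.
export LLM_COMMAND_R_BASE_URL=https://my-deployment.swedencentral.inference.ai.azure.com/v1
llm -m command-r-plus "Say hello"
```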

simonw added a commit that referenced this pull request Mar 28, 2025
simonw closed this Mar 28, 2025
simonw added a commit that referenced this pull request Mar 28, 2025