
Support alternative model names when using OpenAI-compatible local AI as backend #26


Open
wants to merge 4 commits into main

Conversation

L-jasmine

I found that OpenAI's client lets you switch servers by setting OPENAI_BASE_URL in the environment, so you can point it at a local LLM server as long as that server exposes the same API format as OpenAI. In that case, however, the model is not necessarily gpt-4-turbo, so I added a model parameter to the corresponding function to allow selecting a different model.

Signed-off-by: csh <458761603@qq.com>
@L-jasmine L-jasmine changed the title Allow change model Support alternative model names when using OpenAI-compatible local AI as backend Jun 24, 2024
@siddhantx0

Wow, cool update.

@L-jasmine
Author

I updated the README to explain how to use this feature.

@j-dominguez9 j-dominguez9 added the enhancement New feature or request label Jun 28, 2024
Labels
enhancement New feature or request
3 participants