Would it be possible to add a variable to target a different OpenAI-API-compatible endpoint?
There are numerous ways to host private models that support the OpenAI API endpoints, so we could chat with private LLMs without relying on third-party services.
Maybe something like adding OPENAI_API_ENDPOINT="http://localhost:11434" in the .env file.
Thanks!
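As a rough sketch of what this could look like on the client side (assuming a Python codebase using the official `openai` package): the variable name `OPENAI_API_ENDPOINT` is just the one proposed above, not an existing convention, and note that Ollama exposes its OpenAI-compatible routes under the `/v1` path.

```python
import os

# Hypothetical variable name from this request; falls back to OpenAI's hosted API.
DEFAULT_ENDPOINT = "https://api.openai.com/v1"

def resolve_base_url() -> str:
    """Read the proposed OPENAI_API_ENDPOINT variable, with a sensible default."""
    return os.environ.get("OPENAI_API_ENDPOINT", DEFAULT_ENDPOINT)

def make_client():
    """Build an OpenAI client pointed at the configured endpoint.

    This should work with any OpenAI-compatible server,
    e.g. Ollama at http://localhost:11434/v1.
    """
    from openai import OpenAI
    return OpenAI(
        base_url=resolve_base_url(),
        # Local servers typically accept any non-empty key.
        api_key=os.environ.get("OPENAI_API_KEY", "not-needed"),
    )
```

With a line like `OPENAI_API_ENDPOINT=http://localhost:11434/v1` in `.env` (loaded via `python-dotenv` or similar), all chat requests would then go to the private server instead of api.openai.com.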