LocalLLM?
#100
Is there a way to set a locally hosted LLM (or multiple models, e.g. Stable Diffusion, Whisper, Tortoise, LLaMA)?

Replies: 1 comment

- Hey, we are modifying the underlying engine to enable you to plug in your own local servers. You can track the progress in #101.
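Since the engine changes tracked in #101 are still in progress, the snippet below is not this project's API. It is only a minimal sketch of what "plugging in your own local server" usually looks like once a locally hosted model (for example llama.cpp's server, vLLM, or Ollama) exposes an OpenAI-compatible endpoint; the base URL, port, and model name are placeholders, not defaults of any particular tool.

```python
# Hypothetical sketch, not this project's API: talk to a locally hosted model
# through an OpenAI-compatible endpoint. Assumes such a server is already
# running at the placeholder address below.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",    # placeholder: wherever your local server listens
    api_key="not-needed-for-local-servers", # most local servers accept any non-empty key
)

response = client.chat.completions.create(
    model="local-llama",  # placeholder: whatever model name your local server registers
    messages=[{"role": "user", "content": "Say hello from a locally hosted model."}],
)
print(response.choices[0].message.content)
```

The same pattern would extend to multiple local models: run one server per model (or a single server that hosts several) and pick the target via the `model` field of each request.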