Request to run local LLMs instead of paid API #1131
Comments
You can try using a project like LiteLLM, which can connect to Ollama and act as a backend for open-source models. I will test it soon and then write a guide.
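To illustrate the setup described above: LiteLLM can run as a proxy that exposes an OpenAI-compatible endpoint in front of Ollama, so a client like LibreChat only needs the base URL swapped. This is a minimal sketch of the request a client would send; the port (8000) and the model name `ollama/llama2` are assumptions for illustration, not values taken from this thread.

```python
import json

# Assumed LiteLLM proxy address (default port is an assumption);
# it forwards OpenAI-style requests to a local Ollama instance.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def chat_request(model, prompt):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Example payload a client pointed at the proxy would POST to BASE_URL
body = chat_request("ollama/llama2", "Hello from a local model!")
print(json.dumps(body))
```

Because the payload follows the OpenAI chat format, the same client code works whether the backend is a paid API or a local model behind the proxy.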
That would be great. Thanks!
If it helps, the LiteLLM docs mention running LibreChat with Ollama on the page below
I think it's pretty misleading to call this project "LibreChat" when it's relying on proprietary, paid APIs |
Libre in software means having "very few limitations on distribution or the right to access the source code to create improved versions." This project is completely open-source, with one of the least restrictive licenses for use, and it utilizes open-source solutions and philosophies to improve upon its origin for those who build and use it. The fact that this project has been downloaded over 650,000 times from GitHub containers, not counting non-container downloads, without a single dollar changing hands, lays claim to the term.

To be clear, it's not relying on proprietary, paid APIs as you said; rather, it builds mainly on a framework that OpenAI has set as the industry standard for interacting with LLMs remotely. LibreChat can utilize countless other open-source solutions to use local large language models following this framework, which is why having a standard is useful. This is not limited to LiteLLM, but extends to many other API services that integrate local LLM solutions.

@nightpool This project lays more claim to being open-source than even Meta's Llama models, and frankly your comment undermines the open-source efforts to provide it freely.
I'm going to close this in favor of #1344. It's not like you can't run local LLMs using other tools in combination with LibreChat. Right now this isn't a huge priority, but a dedicated 'reverse proxy' endpoint is planned to serve any alternative endpoint need, provided the OpenAI spec is followed, which even the Mistral API follows.
Contact Details
No response
What features would you like to see added?
There are free, open-source LLMs you can run at home now, and they will only get better.
Installation has been made easy by tools like https://ollama.ai/
Please add support to connect to local models via Ollama.
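As a sketch of what connecting to Ollama involves: Ollama serves local models over HTTP, by default on port 11434. The snippet below builds a minimal request body for its generate endpoint; the model name `llama2` is an example, and the prompt is illustrative.

```python
import json

# Default local Ollama address (port 11434 is Ollama's documented default)
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate_request(model, prompt):
    """Build a minimal JSON body for Ollama's generate endpoint.

    stream=False asks for a single complete response instead of
    a stream of partial tokens.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

# Example payload a client would POST to OLLAMA_URL
payload = generate_request("llama2", "Why run LLMs locally?")
print(payload)
```

A chat frontend would POST this body to the local URL and read the model's reply from the JSON response, with no paid API involved.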
More details
There is a great need for people to be able to run their own open-source LLM at home.
Which components are impacted by your request?
No response
Pictures
No response
Code of Conduct