
Open source large language model support #86

Closed
andzejsp opened this issue Dec 6, 2023 · 14 comments

Comments

@andzejsp

andzejsp commented Dec 6, 2023

Is it possible to run this and point it at a self-hosted large language model instead of OpenAI?

@joshbickett
Contributor

@andzejsp this would be possible. It would likely require some slight changes to the prompting and some adjustments to functions in the repo. If someone finds a good provider hosting a model with a good API and key access, go ahead and add it to the project as a PR.

@orkutmuratyilmaz

@joshbickett would you consider starting with Ollama?

I created a similar feature request: #35

@joshbickett
Contributor

Yeah, if someone could get a PR with a vision model working locally on the project, that'd be great, I think.

@Andy1996247

> Yeah, if someone could get a PR with a vision model working locally on the project, that'd be great, I think.

Would this work? https://llava-vl.github.io/

https://simonwillison.net/2023/Nov/29/llamafile/

@joshbickett
Contributor

@Andy1996247 it sounds like it may work based on what you mentioned in #101

@norzog

norzog commented Dec 15, 2023

@joshbickett
Contributor

I'm not very familiar with Petals Chat. It may work, but I think llama.cpp is the most promising.

@joshbickett
Contributor

joshbickett commented Dec 19, 2023

@Andy1996247 @orkutmuratyilmaz @norzog I wanted to mention that we added support for the Gemini model, in case you're interested. It was merged in PR #110.

@orkutmuratyilmaz

@joshbickett thanks for the update. We're one step closer to open-source LLM support 🤘🏻

@andzejsp
Author

It would be cool if Ollama were supported: https://github.com/jmorganca/ollama
Simply point it at an Ollama instance and Bob's your uncle. Not sure how it all works, but Ollama was no pain to set up and very usable, just one command to get it running :).
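For anyone wondering what "pointing at an Ollama instance" could look like in practice, here is a minimal sketch, assuming Ollama is running locally on its default port (11434) and a text model such as `mistral` has already been pulled; how this would actually be wired into the project's prompting is not shown:

```python
import requests

# Default local Ollama endpoint; the model name is just an example of one you might have pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_ollama(prompt: str, model: str = "mistral") -> str:
    """Send a prompt to a local Ollama instance and return the generated text."""
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_ollama("Summarize what Ollama does in one sentence."))
```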

@michaelhhogue
Collaborator

> It would be cool if Ollama were supported: https://github.com/jmorganca/ollama
>
> Simply point it at an Ollama instance and Bob's your uncle. Not sure how it all works, but Ollama was no pain to set up and very usable, just one command to get it running :).

Currently working on LLaVA support through Ollama as we speak :)

Obviously accuracy will be low, but I think it'll be great to finally have support for an open-source model!
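For reference, a rough sketch of what sending a screenshot to LLaVA through Ollama's REST API could look like, assuming the `llava` model has been pulled; the prompt and file path are placeholders, not the actual code from the work in progress:

```python
import base64
import requests

def describe_screenshot(image_path: str, prompt: str) -> str:
    """Ask a local LLaVA model served by Ollama to reason about a screenshot."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llava",
            "prompt": prompt,
            # Ollama's multimodal models accept base64-encoded images in this field.
            "images": [image_b64],
            "stream": False,
        },
        timeout=300,
    )
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(describe_screenshot("screenshot.png", "What should be clicked next to open the browser?"))
```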

@bpshaver

Heads up, I think you should be able to stand up your own OpenAI-compatible API here:

https://llama-cpp-python.readthedocs.io/en/latest/server/#multimodal-models

Then this project can point to your self-hosted API instead of OpenAI.
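If you go that route, the client side could be as small as the sketch below, assuming the llama-cpp-python server is already running locally on port 8000 with a multimodal model loaded (as described in the linked docs); the model name and prompt here are illustrative, not anything this project ships:

```python
from openai import OpenAI

# Point the standard OpenAI client at the self-hosted, OpenAI-compatible server
# instead of api.openai.com. The API key is not checked, but must be a non-empty string.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-no-key-required")

completion = client.chat.completions.create(
    model="local-model",  # with a single loaded model, the name is mostly informational (assumption)
    messages=[
        {"role": "user", "content": "Describe the next step to open a text editor."}
    ],
)
print(completion.choices[0].message.content)
```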

@joshbickett
Contributor

We now have LLaVA available in the project thanks to a PR from @michaelhhogue!

@orkutmuratyilmaz

Thanks for the LLaVA support :)
