
Feature: Google Gemini and Local LLM Support #308

Closed
sumant1122 opened this issue May 23, 2024 · 5 comments
@sumant1122

I see that OpenAI is used for LLM tasks, but not everyone has access to a paid OpenAI account. Google Gemini support would be great, and so would a function to connect to a local LLM.

@cyyeh
Member

cyyeh commented May 23, 2024

@sumant1122 welcome to check out this issue: #277

@cyyeh
Member

cyyeh commented May 23, 2024

Since we are currently working hard on improving the end-to-end user experience, we may not have enough time at the moment to support other LLMs and do the prompt engineering for them as well. If you have thoughts on how to make the process easily scalable, we definitely welcome your feedback and contributions :)

For customizing your LLM or Document Store, please also refer to the guide: https://docs.getwren.ai/installation/custom_llm

Thank you

@sumant1122
Author

Thank you. I will check out the docs and try to contribute.

@cyyeh
Member

cyyeh commented Jun 11, 2024

All, Ollama has been integrated in this branch, and you can also use any OpenAI API-compatible LLM: chore/ai-service/update-env
We'll merge this branch into the main branch in the near future and update the documentation.
For now, I'll delete the original ollama branch.
Thank you all for your patience.

related pr: #376
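For readers wondering what "OpenAI API-compatible" means in practice: servers like Ollama expose the same `/v1/chat/completions` request shape as OpenAI, so a client only needs a different base URL. Below is a minimal, stdlib-only sketch of building such a request; the endpoint is Ollama's default port and the model name is illustrative, not WrenAI's actual configuration keys.

```python
import json

# Hypothetical settings for an OpenAI-API-compatible local server.
# Port 11434 is Ollama's default; the model name is illustrative.
OPENAI_API_BASE = "http://localhost:11434/v1"
MODEL = "llama3"

def build_chat_request(prompt: str) -> tuple[str, bytes]:
    """Return (url, body) for a chat-completions call against any
    OpenAI-API-compatible endpoint. Sending the request (and auth)
    is left out to keep the sketch self-contained."""
    url = f"{OPENAI_API_BASE}/chat/completions"
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

url, body = build_chat_request("Which tables reference the orders table?")
print(url)  # http://localhost:11434/v1/chat/completions
```

Because only the base URL and model name change, swapping between OpenAI and a local server becomes a configuration concern rather than a code change, which is what makes this integration path scale to other providers.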

@cyyeh
Member

cyyeh commented Jun 28, 2024

All, we now support Ollama and OpenAI API-compatible LLMs with the latest release: https://github.com/Canner/WrenAI/releases/tag/0.6.0

Setup instructions for running Wren AI with custom LLMs: https://docs.getwren.ai/installation/custom_llm#running-wren-ai-with-your-custom-llm-or-document-store

Currently, there is one notable limitation for custom LLMs: you need to use the same provider (such as OpenAI or Ollama) for both the LLM and the embedding model. We'll fix that and release a new version soon. Stay tuned 🙂

I'll close this issue as completed now.

@cyyeh cyyeh closed this as completed Jun 28, 2024