
Feature Request - Choice to power from Local LLM server #11

Closed
TSM-EVO opened this issue May 13, 2024 · 1 comment

TSM-EVO commented May 13, 2024

It would be very handy to have the ability to connect to a local LLM server such as Ollama, to have a truly local solution capable of generating these agents.
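For context, Ollama exposes an OpenAI-compatible HTTP endpoint, so one low-friction integration path would be to make the client's base URL and model name configurable. Below is a minimal sketch, not the project's actual API, assuming the standard `openai` Python client and Ollama's default port; the model name is illustrative.

```python
# Minimal sketch: point an OpenAI-compatible client at a local Ollama server.
# Assumes Ollama is running locally on its default port (11434) and the
# `openai` Python package is installed. The model name is an example only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # the client requires a key, but Ollama ignores it
)

response = client.chat.completions.create(
    model="llama3",  # any model previously pulled with `ollama pull`
    messages=[{"role": "user", "content": "Draft a system prompt for a research agent."}],
)
print(response.choices[0].message.content)
```

With this approach, supporting a local server mostly reduces to exposing `base_url` and `model` as user settings rather than hard-coding a hosted provider.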

jgravelle (Owner) commented May 13, 2024 via email
