Support for Ollama API #4

Open
imbev opened this issue Jan 14, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

imbev commented Jan 14, 2024

Ollama is an open source application that makes it very easy to run LLMs locally via a CLI or an HTTP API. I suggest adding support for Ollama's API.

https://github.com/jmorganca/ollama/blob/main/docs/api.md
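
For reference, here is a minimal sketch of what a client call to Ollama's /api/generate endpoint could look like (assuming a local Ollama server on its default port 11434; the model name and helper function are just illustrative):

```python
import requests

# Assumes a local Ollama server on the default port; "llama2" is only an example model name.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_ollama(prompt: str, model: str = "llama2") -> str:
    """Send a single prompt to Ollama and return the full response text."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    # With "stream": False, Ollama returns a single JSON object whose
    # "response" field holds the generated text.
    return response.json()["response"]

print(ask_ollama("Why is the sky blue?"))
```

The API also supports streaming responses (the default), which would fit a chat UI nicely, but the non-streaming form keeps the example short.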

pymike00 added the enhancement label Jan 14, 2024
@pymike00
Owner

Hi there @imbev!

Thanks for your input!! Yes, support for local models is definitely something that I would like to implement in future releases, right after chat history.

Regarding the Ollama API: is it still WSL-only under Windows? What system do you use it on?

Cheers!!

imbev commented Jan 15, 2024

Hello @pymike00

I currently use Ollama on Debian Linux.

It can be compiled on Windows, but the Windows version is definitely not ready yet. https://github.com/jmorganca/ollama/blob/main/docs/development.md

pymike00 commented Jan 15, 2024

I think in that case it's probably better to wait for proper Windows support before adding code for it.

After all, the main value of the project lies in its simplicity, I think.

Thank you for your feedback, it's much appreciated.

Feel free to post any other suggestions you may have.

Happy Coding!
