
OLLAMA forced to use local process and has no API override [ISSUE] #569

Open
Mookins opened this issue May 18, 2024 · 3 comments

Comments

@Mookins

Mookins commented May 18, 2024

Ollama running remotely via API should NEVER have the API sitting exposed; it's a security risk. Why wouldn't you have the same options in the settings panel to set it manually?

@cafeTechne

cafeTechne commented May 18, 2024 via email

@Mookins
Author

Mookins commented May 18, 2024

That's the most ridiculous statement. Forcing local use only means anyone who runs a separate machine has to put in extra time when it could just work. It's not as if adding it would have been a sinkhole; the options are already there and were built for the other endpoints. OpenAI-compatible endpoints exist, so it would have been less work to just code it with this in mind.

And finally, just because something isn't primarily focused on security doesn't mean it should just ignore it... There is a reason many projects in this space are adding security features... but by all means, keep running your public-facing APIs with no security.

@Mookins
Author

Mookins commented May 18, 2024

It's literally just allowing an API key input box... I'm still baffled by that comment.
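For context, the feature being asked for is small. A minimal sketch, assuming a hypothetical helper `build_request_headers` (not part of Ollama or this project's actual code): an optional API key field that, when set, is sent as a Bearer token, so a remote Ollama instance sitting behind an authenticating proxy or an OpenAI-compatible gateway can be used instead of a forced local process.

```python
def build_request_headers(api_key=None):
    """Build HTTP headers for an Ollama/OpenAI-compatible endpoint.

    Hypothetical illustration of the requested setting: if an API key
    is provided, attach it as a Bearer token (the convention used by
    OpenAI-compatible endpoints); otherwise send no Authorization
    header, matching a local, unauthenticated Ollama process.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers


# Local process, no key configured:
local_headers = build_request_headers()

# Remote endpoint with a key from a settings-panel input box:
remote_headers = build_request_headers("my-secret-key")
```

The same pattern is what the existing settings already do for the other (OpenAI-compatible) endpoints, which is the commenter's point: the plumbing exists, only the input box is missing for Ollama.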

Labels: None yet
Projects: None yet
Development: No branches or pull requests

2 participants