
External server #60

Closed
TijuanaKez opened this issue May 29, 2024 · 3 comments
Comments

@TijuanaKez

I like the interface and colours, but I can't run Ollama efficiently on my Mac.
I do have it running on a Linux box with a 3090 on my LAN, though!
Could you please add a setting to choose between localhost and a LAN IP for the server connection, like OpenWebUI?
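(For anyone wanting a workaround until the app supports this: Ollama itself can be exposed to the LAN via its `OLLAMA_HOST` environment variable. A minimal sketch, where `192.168.1.50` is a placeholder for the server's actual LAN IP:)

```shell
# On the Linux box with the 3090: bind the Ollama server to all
# interfaces instead of the default 127.0.0.1, so LAN clients can reach it.
OLLAMA_HOST=0.0.0.0 ollama serve

# From the Mac: check the server is reachable over the LAN.
# 11434 is Ollama's default port; 192.168.1.50 is a placeholder IP.
curl http://192.168.1.50:11434/api/tags
```

This only exposes the server; the client app still needs a configurable endpoint to use it, which is what this issue asks for.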

@rb81

rb81 commented Jun 22, 2024

+1 on this. I also run Ollama on a different machine, so being able to configure the endpoint would be useful.

@reestr

reestr commented Jul 3, 2024

This request is a duplicate of #2.

@kevinhermawan
Owner

Hey everyone! Finally, we can close this issue now. Check out the features in the new update! 😉

Labels: none yet
Projects: none yet
Development: no branches or pull requests

4 participants