Add nextjs ollama llm UI frontend for Ollama #313146
@@ -0,0 +1,87 @@
{
  config,
  pkgs,
  lib,
  ...
}:
let
  cfg = config.services.nextjs-ollama-llm-ui;
  # We have to override the URL to an Ollama service here, because it gets baked into the web app.
  nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = "https://ollama.lambdablob.com"; };
in
{
  options = {
    services.nextjs-ollama-llm-ui = {
      enable = lib.mkEnableOption ''
        Simple Ollama web UI service; an easy-to-use web frontend for an Ollama backend service.
        Run state-of-the-art AI large language models (LLMs) similar to ChatGPT locally, with privacy,
        on your personal computer.
        This service is stateless and doesn't store any data on the server; all data is kept
        locally in your web browser.
        See https://github.com/jakobhoeg/nextjs-ollama-llm-ui.

        Required: You need the Ollama backend service running by having
        "services.nextjs-ollama-llm-ui.ollamaUrl" point to the correct URL.
        You can host such a backend service with NixOS through "services.ollama".
      '';
      package = lib.mkPackageOption pkgs "nextjs-ollama-llm-ui" { };

      hostname = lib.mkOption {
        type = lib.types.str;
        default = "127.0.0.1";
        example = "ui.example.org";
        description = ''
          The hostname under which the Ollama UI interface should be accessible.
          By default it uses localhost/127.0.0.1 to be accessible only from the local machine.
          Change it to "0.0.0.0" to make it directly accessible from the local network.

          Note: You should keep it at 127.0.0.1 and only serve to the local
          network or internet from a (home) server behind a reverse proxy with TLS encryption.
          See https://wiki.nixos.org/wiki/Nginx for instructions on how to set up a reverse proxy.
        '';
      };

      port = lib.mkOption {
        type = lib.types.port;
        default = 3000;
        example = 3000;

Comment on lines +46 to +47:

This by default conflicts with a lot of services: https://github.com/search?q=repo%3ANixOS%2Fnixpkgs+default+%3D+3000%3B&type=code
For example, in an onion-service setup that runs several services which all default to port 3000, this forces manual port management. Wouldn't it be better if Nix handled that, for instance by assigning a unique port per service?

We would need something like systemd's "DynamicUser" feature that just picks a new, unique one, but for ports. For NixOS modules I don't care about the port if my service sits behind an nginx reverse proxy. Do you know of such a feature?

Systemd doesn't seem to have this feature. Should we file a feature request for it? It's a minor annoyance in both scenarios 🤔

True. Although now that I think about it, I don't believe it would work if systemd picked a random port, because to configure e.g. nginx as a reverse proxy you need to know the port to forward to at Nix build time (so beforehand). So I would keep the default port the same as the upstream project does (for nextjs-ollama-llm-ui that's port 3000).

There was some work being done in NixOS/rfcs#151, but it was closed due to a lack of interest.

Thanks for the info. Unfortunately, I don't have enough time to move this forward.

        description = ''
          The port under which the Ollama UI interface should be accessible.
        '';
      };

      ollamaUrl = lib.mkOption {
        type = lib.types.str;
        default = "127.0.0.1:11434";
        example = "https://ollama.example.org";
        description = ''
          The address (including host and port) under which we can access the Ollama backend server.
          Note that if the UI service is running under a domain such as "https://ui.example.org",
          the Ollama backend service must allow "CORS" requests from this domain, e.g. by adding
          "services.ollama.environment.OLLAMA_ORIGINS = [ ... "https://ui.example.org" ];".
        '';
      };
    };
  };

  config = lib.mkIf cfg.enable {
    systemd.services = {
      nextjs-ollama-llm-ui = {
        wantedBy = [ "multi-user.target" ];
        description = "Next.js Ollama LLM UI";
        after = [ "network.target" ];
        environment = {
          HOSTNAME = cfg.hostname;
          PORT = toString cfg.port;
          NEXT_PUBLIC_OLLAMA_URL = cfg.ollamaUrl;
        };
        serviceConfig = {
          ExecStart = "${lib.getExe nextjs-ollama-llm-ui}";
          DynamicUser = true;
        };
      };
    };
  };

  meta.maintainers = with lib.maintainers; [ malteneuss ];
}
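
For context, a minimal sketch of how this module could be used in a NixOS configuration, following the recommendations in the option descriptions above. The domain name, the nginx virtual host and the ACME setup are illustrative assumptions, not part of this PR:

```nix
{ config, ... }:
{
  # Ollama backend on the same machine; its default listen address matches
  # the module's default ollamaUrl of 127.0.0.1:11434.
  services.ollama.enable = true;

  services.nextjs-ollama-llm-ui = {
    enable = true;
    # Pick a non-default port if something else already occupies 3000.
    port = 8080;
    # Keep the app bound to localhost; expose it only through the proxy below.
    hostname = "127.0.0.1";
    # NOTE: this URL gets baked into the client-side app, so browsers on other
    # machines must be able to reach it, and Ollama must allow CORS requests
    # from the UI's origin (see the ollamaUrl description above).
    ollamaUrl = "http://127.0.0.1:11434";
  };

  # Serve the UI with TLS via nginx, as the hostname description recommends.
  # Assumes security.acme is configured elsewhere in the system.
  services.nginx = {
    enable = true;
    recommendedProxySettings = true;
    virtualHosts."ui.example.org" = {
      forceSSL = true;
      enableACME = true;
      locations."/".proxyPass =
        "http://127.0.0.1:${toString config.services.nextjs-ollama-llm-ui.port}";
    };
  };
}
```

Referencing config.services.nextjs-ollama-llm-ui.port in the proxyPass keeps the reverse proxy in sync with whatever port is chosen, which sidesteps most of the port-collision concern discussed in the thread above.
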
@@ -0,0 +1,22 @@
{ lib, ... }:

{
  name = "nextjs-ollama-llm-ui";
  meta.maintainers = with lib.maintainers; [ malteneuss ];

  nodes.machine =
    { pkgs, ... }:
    {
      services.nextjs-ollama-llm-ui = {
        enable = true;
        port = 8080;
      };
    };

  testScript = ''
    # Ensure the service is started and reachable
    machine.wait_for_unit("nextjs-ollama-llm-ui.service")
    machine.wait_for_open_port(8080)
    machine.succeed("curl --fail http://127.0.0.1:8080")
  '';
}
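
Assuming the test is registered in nixos/tests/all-tests.nix under the attribute name nextjs-ollama-llm-ui (that wiring is not shown in this diff), it can be run from a nixpkgs checkout roughly like this:

```shell
# Build and run the VM test (attribute name assumed, see above):
nix-build -A nixosTests.nextjs-ollama-llm-ui

# Or start the interactive driver to inspect the VM by hand:
nix-build -A nixosTests.nextjs-ollama-llm-ui.driverInteractive
./result/bin/nixos-test-driver
```
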
Shouldn't this be using the user's chosen ollamaUrl? I can't get it working unless I override the package manually.

Yep, this slipped through. It will be fixed with the next release: #315184
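
For reference, a sketch of what the presumed fix would look like in the module's let binding, using the user-configured option instead of the hard-coded URL (not part of this diff; see the linked follow-up):

```nix
# Hypothetical follow-up change (see #315184): bake the configured ollamaUrl
# into the web app instead of a fixed URL.
nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = cfg.ollamaUrl; };
```
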