Add nextjs ollama llm UI frontend for Ollama #313146

Merged 3 commits on May 24, 2024
5 changes: 5 additions & 0 deletions maintainers/maintainer-list.nix
@@ -12415,6 +12415,11 @@
githubId = 18661391;
name = "Malte Janz";
};
malteneuss = {
github = "malteneuss";
githubId = 5301202;
name = "Malte Neuss";
};
malte-v = {
email = "nixpkgs@mal.tc";
github = "malte-v";
2 changes: 2 additions & 0 deletions nixos/doc/manual/release-notes/rl-2405.section.md
@@ -137,6 +137,8 @@ The pre-existing [services.ankisyncd](#opt-services.ankisyncd.enable) has been m

- [ollama](https://ollama.ai), server for running large language models locally.

- [nextjs-ollama-llm-ui](https://github.com/jakobhoeg/nextjs-ollama-llm-ui), light-weight frontend server to chat with Ollama models through a web app.

- [ownCloud Infinite Scale Stack](https://owncloud.com/infinite-scale-4-0/), a modern and scalable rewrite of ownCloud.

- [PhotonVision](https://photonvision.org/), a free, fast, and easy-to-use computer vision solution for the FIRST® Robotics Competition.
1 change: 1 addition & 0 deletions nixos/modules/module-list.nix
@@ -1399,6 +1399,7 @@
./services/web-apps/netbox.nix
./services/web-apps/nextcloud.nix
./services/web-apps/nextcloud-notify_push.nix
./services/web-apps/nextjs-ollama-llm-ui.nix
./services/web-apps/nexus.nix
./services/web-apps/nifi.nix
./services/web-apps/node-red.nix
87 changes: 87 additions & 0 deletions nixos/modules/services/web-apps/nextjs-ollama-llm-ui.nix
@@ -0,0 +1,87 @@
{
config,
pkgs,
lib,
...
}:
let
cfg = config.services.nextjs-ollama-llm-ui;
# we have to override the URL to a Ollama service here, because it gets baked into the web app.
nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = "https://ollama.lambdablob.com"; };
Contributor:

Shouldn't this be using the user's chosen ollamaUrl?

I can't get it working unless I override the package manually.

Contributor (author):

Yep, this slipped through. It will be fixed with the next release #315184
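Until that fix lands, a minimal sketch of the manual workaround mentioned above is to override the package's ollamaUrl argument yourself in your NixOS configuration (assuming pkgs is in scope; the URL is a placeholder for your own Ollama endpoint):

services.nextjs-ollama-llm-ui = {
  enable = true;
  # Workaround sketch: bake the real backend URL into the web app at build time,
  # since the hardcoded override above currently ignores the module's ollamaUrl option.
  package = pkgs.nextjs-ollama-llm-ui.override {
    ollamaUrl = "http://127.0.0.1:11434"; # placeholder; point at your Ollama server
  };
};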

in
{
options = {
services.nextjs-ollama-llm-ui = {
enable = lib.mkEnableOption ''
Simple Ollama web UI service; an easy to use web frontend for a Ollama backend service.
Run state-of-the-art AI large language models (LLM) similar to ChatGPT locally with privacy
on your personal computer.
This service is stateless and doesn't store any data on the server; all data is kept
locally in your web browser.
See https://github.com/jakobhoeg/nextjs-ollama-llm-ui.

Required: You need the Ollama backend service running by having
"services.nextjs-ollama-llm-ui.ollamaUrl" point to the correct url.
You can host such a backend service with NixOS through "services.ollama".
'';
package = lib.mkPackageOption pkgs "nextjs-ollama-llm-ui" { };

hostname = lib.mkOption {
type = lib.types.str;
default = "127.0.0.1";
example = "ui.example.org";
description = ''
The hostname under which the Ollama UI interface should be accessible.
By default it uses localhost/127.0.0.1 to be accessible only from the local machine.
Change to "0.0.0.0" to make it directly accessible from the local network.

Note: You should keep it at 127.0.0.1 and only serve to the local
network or internet from a (home) server behind a reverse-proxy and secured encryption.
See https://wiki.nixos.org/wiki/Nginx for instructions on how to set up a reverse-proxy.
'';
};

port = lib.mkOption {
type = lib.types.port;
default = 3000;
example = 3000;
Comment on lines +46 to +47
Contributor:

This by default conflicts with a lot of services:

https://github.com/search?q=repo%3ANixOS%2Fnixpkgs+default+%3D+3000%3B&type=code

For example, in an onion-service setup that runs several services which all default to port 3000, this forces manual port management. Wouldn't it be better if Nix handled this, for instance by assigning a unique default port per service, so such setups work out of the box?

Contributor (author):

We would need something like systemd's "DynamicUser" feature that just picks one new, unique one, but for ports. For NixOS modules I don't care about the port if my service sits behind an nginx reverse-proxy. Do you know about such a feature?

Contributor:

We would need something like systemd's "DynamicUser" feature that just picks one new, unique one, but for ports. [...] Do you know about such a feature? -- @malteneuss (#313146 (comment))

Systemd doesn't seem to have this feature. Should we file a feature request for it?

For NixOS modules I don't care about the port if my service sits behind an nginx reverse-proxy. -- @malteneuss (#313146 (comment))

It's a minor annoyance in both scenarios.

Contributor (author):

True. Although now that I think about it, I don't believe it would work if systemd picked a random port, because to configure e.g. nginx as a reverse proxy you need to know the port to forward to at Nix build time (so beforehand). So I would keep the default port the same as the upstream project for now (for nextjs-ollama-llm-ui that's port 3000).

Member (@jopejoe1, May 28, 2024):

There was some work being done in NixOS/rfcs#151, but it was closed due to a lack of interest.

Contributor (author):

Thanks for the info. Unfortunately, I don't have enough time to move this forward.
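For completeness, a rough sketch of the setup discussed here (keep the UI on localhost with a non-default port and let nginx act as the reverse proxy) might look like this; the domain and port below are illustrative, not defaults of this module:

services.nextjs-ollama-llm-ui = {
  enable = true;
  hostname = "127.0.0.1"; # stay local; nginx handles external traffic
  port = 3001;            # pick any free port if 3000 is already taken
};

services.nginx = {
  enable = true;
  virtualHosts."ui.example.org" = {
    forceSSL = true;
    enableACME = true;
    locations."/".proxyPass = "http://127.0.0.1:3001";
  };
};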

description = ''
The port under which the Ollama UI interface should be accessible.
'';
};

ollamaUrl = lib.mkOption {
type = lib.types.str;
default = "127.0.0.1:11434";
example = "https://ollama.example.org";
description = ''
The address (including host and port) under which we can access the Ollama backend server.
!Note that if the the UI service is running under a domain "https://ui.example.org",
the Ollama backend service must allow "CORS" requests from this domain, e.g. by adding
"services.ollama.environment.OLLAMA_ORIGINS = [ ... "https://ui.example.org" ];"!
'';
};
};
};

config = lib.mkIf cfg.enable {
systemd.services = {

nextjs-ollama-llm-ui = {
wantedBy = [ "multi-user.target" ];
description = "Nextjs Ollama LLM Ui.";
after = [ "network.target" ];
environment = {
HOSTNAME = cfg.hostname;
PORT = toString cfg.port;
NEXT_PUBLIC_OLLAMA_URL = cfg.ollamaUrl;
};
serviceConfig = {
ExecStart = "${lib.getExe nextjs-ollama-llm-ui}";
DynamicUser = true;
};
};
};
};
meta.maintainers = with lib.maintainers; [ malteneuss ];
}
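As a side note on the CORS remark in the ollamaUrl description above, a hedged sketch of a split setup, where the UI talks to an Ollama backend on another domain, could look like this. Both domains are placeholders, and the exact CORS option on the Ollama side is only quoted from the module description, so check it against your services.ollama module:

services.nextjs-ollama-llm-ui = {
  enable = true;
  ollamaUrl = "https://ollama.example.org"; # illustrative remote backend
};

# On the Ollama host, allow cross-origin requests from the UI's domain,
# e.g. (as quoted in the module description above):
#   services.ollama.environment.OLLAMA_ORIGINS = [ ... "https://ui.example.org" ];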
1 change: 1 addition & 0 deletions nixos/tests/all-tests.nix
@@ -616,6 +616,7 @@ in {
# TODO: put in networking.nix after the test becomes more complete
networkingProxy = handleTest ./networking-proxy.nix {};
nextcloud = handleTest ./nextcloud {};
nextjs-ollama-llm-ui = runTest ./web-apps/nextjs-ollama-llm-ui.nix;
nexus = handleTest ./nexus.nix {};
# TODO: Test nfsv3 + Kerberos
nfs3 = handleTest ./nfs { version = 3; };
22 changes: 22 additions & 0 deletions nixos/tests/web-apps/nextjs-ollama-llm-ui.nix
@@ -0,0 +1,22 @@
{ lib, ... }:

{
name = "nextjs-ollama-llm-ui";
meta.maintainers = with lib.maintainers; [ malteneuss ];

nodes.machine =
{ pkgs, ... }:
{
services.nextjs-ollama-llm-ui = {
enable = true;
port = 8080;
};
};

testScript = ''
# Ensure the service is started and reachable
machine.wait_for_unit("nextjs-ollama-llm-ui.service")
machine.wait_for_open_port(8080)
machine.succeed("curl --fail http://127.0.0.1:8080")
'';
}
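For readers who want to try the module outside the test, a minimal end-to-end configuration that pairs it with the Ollama backend on the same host could look roughly like this (module defaults are spelled out for clarity; adjust as needed):

{ ... }:
{
  # Ollama backend; it listens locally on port 11434 by default,
  # which matches the UI module's default ollamaUrl.
  services.ollama.enable = true;

  services.nextjs-ollama-llm-ui = {
    enable = true;
    port = 3000;                   # upstream default
    ollamaUrl = "127.0.0.1:11434"; # module default, shown here for clarity
  };
}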