Add nextjs ollama llm UI frontend for Ollama #313146

Merged: 3 commits into NixOS:master from malteneuss:add-nextjs-ollama-llm-ui on May 24, 2024

Conversation

malteneuss
Contributor

@malteneuss malteneuss commented May 20, 2024

Description of changes

NixOS already has good support for the Ollama backend service. Now we can benefit from a convenient web frontend for it as well. This is a much simpler (stateless) service with fewer dependencies than e.g. https://github.com/open-webui/open-webui, as requested in #309567.

Things done

Add https://github.com/jakobhoeg/nextjs-ollama-llm-ui as a Nix package and add a corresponding NixOS module.

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
    • sandbox = relaxed
    • sandbox = true
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 24.05 Release Notes (or backporting 23.05 and 23.11 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

You can try this module out by adding the following parts to your existing flake.nix file:

# file: flake.nix
{
  inputs.nixpkgs-nextjs-ollama-llm-ui.url = "github:malteneuss/nixpkgs/add-nextjs-ollama-llm-ui";
  # ...
  outputs =
    inputs@{ self
    , nixpkgs
    , nixpkgs-nextjs-ollama-llm-ui
    , ...
    }:
    {
      nixosConfigurations.elite = nixpkgs.lib.nixosSystem {
        modules = [
          {
            imports = [
              "${nixpkgs-nextjs-ollama-llm-ui}/nixos/modules/services/web-apps/nextjs-ollama-llm-ui.nix"
            ];
            services.nextjs-ollama-llm-ui = {
              enable = true;
              host = "127.0.0.1";
              port = 11435;
              ollamaUrl = "127.0.0.1:11434";
              # Adjust "x86_64-linux" to your system.
              package = nixpkgs-nextjs-ollama-llm-ui.legacyPackages.x86_64-linux.nextjs-ollama-llm-ui;
            };
          }
          # ...
        ];
      };
    };
}

and visiting http://127.0.0.1:11435 in your browser.
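
Once this module lands in nixpkgs itself, the flake indirection above should no longer be needed. A minimal sketch of the eventual configuration, assuming the option names from this PR (values are illustrative):

# file: configuration.nix (sketch, not part of this PR)
{
  # Existing NixOS module for the Ollama backend.
  services.ollama.enable = true;

  services.nextjs-ollama-llm-ui = {
    enable = true;
    port = 11435;                  # the module's default is 3000
    ollamaUrl = "127.0.0.1:11434"; # where the Ollama backend listens
  };
}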

Add a 👍 reaction to pull requests you find important.

@drupol
Contributor

drupol commented May 20, 2024

Could you make sure the CI is passing?

@malteneuss malteneuss force-pushed the add-nextjs-ollama-llm-ui branch 2 times, most recently from 570b290 to 3845bee on May 20, 2024 20:59
@malteneuss
Contributor Author

Ah thanks. Some whitespace issues are fixed now.

@drupol
Contributor

drupol commented May 20, 2024

Could you pass through nixfmt-rfc-style the new files you're introducing?

@cole-h
Member

cole-h commented May 20, 2024

@ofborg eval

@malteneuss
Contributor Author

@drupol Done. I didn't know there was a new formatting style other than plain nixpkgs-fmt.

Contributor

@drupol drupol left a comment

LGTM

@drupol
Contributor

drupol commented May 21, 2024

How about adding an entry in the release notes?

@@ -125,7 +125,9 @@ Use `services.pipewire.extraConfig` or `services.pipewire.configPackages` for Pi

- [rspamd-trainer](https://gitlab.com/onlime/rspamd-trainer), script triggered by a helper which reads mails from a specific mail inbox and feeds them into rspamd for spam/ham training.

- - [ollama](https://ollama.ai), server for running large language models locally.
+ - [ollama](https://ollama.ai), backend server for running large language models locally.
Contributor

This line should not be modified.

Contributor Author

Reverted

@malteneuss
Contributor Author

Good idea. I added a release notes entry for the 24.05 release.

@malteneuss
Contributor Author

Resolved conflicts. Do you have merge rights, or would I need to look for someone who does?

Copy link
Contributor

@drupol drupol left a comment

Can you add a simple NixOS test that checks that the service is launched and the relevant port is open?
There are plenty of examples in the nixos directory.
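
For reference, a minimal sketch of what such a test could look like, following the usual nixos/tests conventions; the attribute names and port below are illustrative and not necessarily what ends up being merged:

# Hypothetical NixOS VM test sketch in the style of nixos/tests/*.
{ ... }:
{
  name = "nextjs-ollama-llm-ui";

  nodes.machine = {
    services.nextjs-ollama-llm-ui = {
      enable = true;
      port = 8080; # any free port works for the test
    };
  };

  testScript = ''
    machine.wait_for_unit("nextjs-ollama-llm-ui.service")
    machine.wait_for_open_port(8080)
    machine.succeed("curl --fail http://127.0.0.1:8080")
  '';
}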

@malteneuss
Contributor Author

malteneuss commented May 23, 2024

Done. Although I can't see what's wrong with that failing manual job.

edit: I think removing a lib.mdDoc call should do the trick.
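
For context: option descriptions are now treated as Markdown by default, so lib.mdDoc is deprecated and the wrapper can simply be dropped. Roughly, with a purely illustrative option:

# Before, the deprecated style that the failing manual job appears to complain about:
description = lib.mdDoc "Hostname under which the web interface is reachable.";
# After, a plain string is already rendered as Markdown:
description = "Hostname under which the web interface is reachable.";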

nixos/tests/all-tests.nix (review thread, resolved)
NixOS already has good support for the Ollama
backend service. Now we can benefit from
having a convenient web frontend as well for it.
@malteneuss
Contributor Author

I have to thank you! In a few days I got to see and learn a lot about nixpkgs that I had been planning to look at for a long time.

@drupol drupol merged commit d9062cd into NixOS:master May 24, 2024
25 of 26 checks passed
Comment on lines +46 to +47
default = 3000;
example = 3000;
Contributor

This by default conflicts with a lot of services:

https://github.com/search?q=repo%3ANixOS%2Fnixpkgs+default+%3D+3000%3B&type=code

For example, in an onion service setup where you try to run multiple services that all default to port 3000, you end up doing manual port management. I would think this would be better handled by Nix, probably by assigning a unique port per service, to make the setup more functional.
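
For illustration, today's workaround is to reassign ports by hand in one's own configuration when two services would otherwise collide on 3000; a sketch with Grafana, whose HTTP port also defaults to 3000 (ports are illustrative):

{
  services.nextjs-ollama-llm-ui = {
    enable = true;
    port = 3001; # moved off the shared default
  };
  services.grafana = {
    enable = true;
    settings.server.http_port = 3000; # Grafana keeps the default
  };
}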

Contributor Author

We would need something like systemd's "DynamicUser" feature that just picks a new, unique one, but for ports. For NixOS modules I don't care about the port if my service sits behind an nginx reverse proxy. Do you know of such a feature?

Contributor

We would need something like systemd's "DynamicUser" feature that just picks a new, unique one, but for ports. [...] Do you know of such a feature? -- @malteneuss (#313146 (comment))

Systemd doesn't seem to have this feature. Should we file a feature request for it?

For NixOS modules I don't care about the port if my service sits behind an nginx reverse proxy. -- @malteneuss (#313146 (comment))

It's a minor annoyance in both scenarios 🤔

Contributor Author

True. Although now that I think about it, I don't believe it would work if systemd picked a random port, because to configure e.g. nginx as a reverse proxy you need to determine the port to forward to at Nix build time (so beforehand). So I would keep the default port the same as the upstream project for now (for nextjs-ollama-llm-ui that's port 3000).
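
For illustration, a sketch of the reverse-proxy case: the upstream port is written into the generated nginx configuration at evaluation time, so it has to be a known, fixed value (the hostname is illustrative):

{ config, ... }:
{
  services.nextjs-ollama-llm-ui = {
    enable = true;
    port = 3000; # must be known here; it cannot be picked at runtime
  };
  services.nginx = {
    enable = true;
    virtualHosts."chat.example.org".locations."/" = {
      # The port is baked into nginx.conf at build time.
      proxyPass = "http://127.0.0.1:${toString config.services.nextjs-ollama-llm-ui.port}";
    };
  };
}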

Member

@jopejoe1 jopejoe1 May 28, 2024

There was some work being done in NixOS/rfcs#151, but it got closed due to a lack of interest.

Contributor Author

Thanks for the info. Unfortunately, I don't have enough time to move this forward.

@malteneuss malteneuss deleted the add-nextjs-ollama-llm-ui branch May 24, 2024 13:37
@malteneuss
Contributor Author

Just wanted to mention that you can use a dev version with a convenient ollama response streaming fix here: #315184 (until the next release version arrives)

let
  cfg = config.services.nextjs-ollama-llm-ui;
  # We have to override the URL to an Ollama service here, because it gets baked into the web app.
  nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = "https://ollama.lambdablob.com"; };
Contributor

Shouldn't this be using the user's chosen ollamaUrl?

I can't get it working unless I override the package manually.

Contributor Author

Yep, this slipped through. It will be fixed with the next release #315184
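
Presumably the fix amounts to passing the configured option through instead of the hard-coded URL, roughly:

# Sketch of the intended behaviour: respect the user's ollamaUrl setting.
nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = cfg.ollamaUrl; };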
