
feat: package as executable (desktop app) #470

Open
tjbck opened this issue Jan 13, 2024 · 22 comments
Labels
core (core feature), enhancement (New feature or request), good first issue (Good for newcomers), help wanted (Extra attention is needed)
Comments

@tjbck
Contributor

tjbck commented Jan 13, 2024

Ideally we would want something like Ollama

Maybe we could use https://pyinstaller.org/en/stable/
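As a rough sketch of what PyInstaller would wrap: a small entry script that picks a port and opens the browser, built with something like `pyinstaller --onefile launcher.py`. The `open_webui.main:app` module path in the comment is an assumption for illustration, not the project's actual layout.

```python
# Hypothetical launcher script for PyInstaller to bundle into a single
# executable. The server hand-off at the end is a placeholder; the real
# module path is an assumption.
import socket
import webbrowser


def find_free_port(preferred: int = 8080) -> int:
    """Return `preferred` if it is free, otherwise an OS-assigned free port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind(("127.0.0.1", preferred))
            return preferred
        except OSError:
            pass  # preferred port busy, fall through
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))  # port 0 = let the OS choose
        return s.getsockname()[1]


def main() -> None:
    port = find_free_port()
    webbrowser.open(f"http://127.0.0.1:{port}")
    # Placeholder: hand off to the real ASGI server here, e.g.
    # uvicorn.run("open_webui.main:app", port=port)


if __name__ == "__main__":
    main()
```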

@tjbck added the enhancement and help wanted labels Jan 13, 2024
@justinh-rahb
Collaborator

justinh-rahb commented Jan 14, 2024

More options for deployment are always a great idea. Packaging on macOS can be a bit of a chore; you'll need a $99/yr developer account to create signed/notarized binaries. Not sure if you need to pay to sign Windows binaries, but the warning is at least easier for the user to bypass if you don't.

@tjbck changed the title from "feat: package as executable" to "feat: package as executable (desktop app)" Jan 19, 2024
@tjbck pinned this issue Jan 19, 2024
@oliverbob
Contributor

oliverbob commented Jan 19, 2024

More options for deployment are always a great idea. Packaging on macOS can be a bit of a chore; you'll need a $99/yr developer account to create signed/notarized binaries. Not sure if you need to pay to sign Windows binaries, but the warning is at least easier for the user to bypass if you don't.

While it seems impractical on macOS, that isn't the case on Windows, Linux, Android, etc. But I believe there is a way to package cross-platform using Flutter.

Though that would perhaps mean creating a new sibling Flutter project. Flutter lets you build desktop apps and, according to its docs, iOS apps as well. I tried it in the past but didn't have a Mac; it works on the other platforms though. The Snap Store also works for distributing executables; I've seen it work on some Apple devices.

@justinh-rahb
Collaborator

justinh-rahb commented Feb 15, 2024

On the packaging front, I am actively looking into Flatpak:

Manifest:

```yaml
app-id: org.ollama-webui.Ollama-WebUI
runtime: org.freedesktop.Platform
runtime-version: '22.08'
sdk: org.freedesktop.Sdk
command: start-webui.sh
finish-args:
  - --share=ipc
  - --socket=x11
  - --socket=wayland
  - --share=network
  - --filesystem=home
  - --device=dri
  - --env=ENV=prod
  - --env=SCARF_NO_ANALYTICS=true
  - --env=DO_NOT_TRACK=true
modules:
  - name: nodejs
    buildsystem: simple
    build-commands:
      - npm install
      - npm run build
      - mkdir -p /app/frontend
      - cp -r build/* /app/frontend/
    sources:
      - type: git
        url: https://github.com/ollama-webui/ollama-webui
        tag: main  # Specify the correct branch or tag here
      - type: archive
        url: https://chroma-onnx-models.s3.amazonaws.com/all-MiniLM-L6-v2/onnx.tar.gz
        dest: /app/data/onnx_models
  - name: python-backend
    buildsystem: simple
    build-commands:
      - pip3 install --no-cache-dir torch torchvision torchaudio -f https://download.pytorch.org/whl/cpu
      - pip3 install --no-cache-dir -r backend/requirements.txt
      - install -D backend/start.sh /app/bin/start-webui.sh
      - cp -r backend/* /app/backend/
      - cp -a /app/data/onnx_models /root/.cache/chroma/onnx_models
    sources:
      - type: git
        url: https://github.com/ollama-webui/ollama-webui
        tag: main  # Specify the correct branch or tag here
    post-install:
      - mkdir -p /app/bin
      - echo -e '#!/bin/sh\nexec /app/backend/start.sh' > /app/bin/start-webui.sh
      - chmod +x /app/bin/start-webui.sh
```

Workflow:

```yaml
name: Build and Release Flatpak

on:
  push:
    tags:
      - 'v*'

permissions:
  contents: write
  id-token: write

jobs:
  flatpak-build-and-release:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install Flatpak and Flatpak Builder
        run: |
          sudo apt-get update -y
          sudo apt-get install -y flatpak flatpak-builder

      - name: Add Flathub repository
        run: flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

      - name: Build Flatpak application
        # --repo exports the build into an OSTree repo so build-bundle can use it;
        # the manifest filename and app-id must match the manifest above
        run: flatpak-builder --force-clean --repo=repo build-dir org.ollama-webui.Ollama-WebUI.yaml

      - name: Create bundle
        run: flatpak build-bundle repo ollama-webui.flatpak org.ollama-webui.Ollama-WebUI

      - name: Create Release and Upload Asset
        uses: softprops/action-gh-release@v1
        with:
          name: ${{ github.ref_name }}
          body: |
            ## Release ${{ github.ref_name }} of Ollama WebUI

            ### 🚀 New Features
            - List new features here
            - Improvements or bug fixes

            ### 📦 Installation Instructions
            To install Ollama WebUI on your system, follow these steps:
            1. Ensure Flatpak is installed on your system.
            2. Download `ollama-webui.flatpak`.
            3. Install the application using `flatpak install ollama-webui.flatpak`.

          tag_name: ${{ github.ref_name }}
          files: |
            ollama-webui.flatpak
          token: ${{ secrets.GITHUB_TOKEN }}
          draft: true # Adjust these as preferred
          prerelease: true # Adjust these as preferred
```

I'll wait until after the rename to start the PR.

@tjbck
Contributor Author

tjbck commented Feb 16, 2024

We should also have something like these from oobabooga/text-generation-webui, so that users can install without Docker with one command and streamline the installation process:

https://github.com/oobabooga/text-generation-webui/blob/main/start_linux.sh
https://github.com/oobabooga/text-generation-webui/blob/main/start_macos.sh
https://github.com/oobabooga/text-generation-webui/blob/main/start_windows.bat
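In the same spirit as those per-OS start scripts, a single cross-platform Python bootstrap could create a venv, install the backend requirements, and launch. This is only a sketch; the `backend/requirements.txt` and `backend/main.py` paths are assumptions about the repo layout.

```python
# Hypothetical one-command bootstrap: venv creation + dependency install
# + launch, working on both Windows and POSIX systems.
import os
import subprocess
import venv
from pathlib import Path


def venv_python(venv_dir: Path) -> Path:
    """Return the interpreter path inside a venv for the current platform."""
    sub = "Scripts" if os.name == "nt" else "bin"
    exe = "python.exe" if os.name == "nt" else "python"
    return venv_dir / sub / exe


def bootstrap(root: Path = Path(".")) -> None:
    venv_dir = root / ".venv"
    if not venv_dir.exists():
        venv.create(venv_dir, with_pip=True)  # create venv on first run
    py = str(venv_python(venv_dir))
    # Install backend deps into the venv, then hand off to the app
    subprocess.check_call([py, "-m", "pip", "install", "-r", "backend/requirements.txt"])
    subprocess.check_call([py, "backend/main.py"])


if __name__ == "__main__":
    bootstrap()
```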

@tjbck added the good first issue and core labels Feb 16, 2024
@tjbck added this to the v1.0 milestone Feb 16, 2024
@justinh-rahb
Collaborator

We should also have something like these from oobabooga/text-generation-webui, so that users can install without Docker with one command and streamline the installation process:

I had thought of doing exactly this; I've got macOS and Linux mostly hammered out already for my own purposes. Since we're on the same page, I'll get those polished up and ready for a PR too 👍

@tjbck
Contributor Author

tjbck commented Feb 22, 2024

We might also want to look into providing an installation option via pip install.
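For illustration, a pip-installable package would register a console entry point (e.g. `open-webui = "open_webui.cli:main"` under `[project.scripts]` in pyproject.toml), so that `pip install open-webui` gives users an `open-webui serve` command. The CLI below is a hypothetical sketch, not the project's actual interface.

```python
# Hypothetical `open-webui` CLI entry point; the server hand-off is a
# placeholder and the subcommand/flag names are assumptions.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="open-webui")
    sub = parser.add_subparsers(dest="command", required=True)
    serve = sub.add_parser("serve", help="start the web UI")
    serve.add_argument("--host", default="127.0.0.1")
    serve.add_argument("--port", type=int, default=8080)
    return parser


def main(argv=None) -> None:
    args = build_parser().parse_args(argv)
    if args.command == "serve":
        # Placeholder for the real server start, e.g.
        # uvicorn.run("open_webui.main:app", host=args.host, port=args.port)
        print(f"serving on {args.host}:{args.port}")
```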

@dz0ny

dz0ny commented Feb 25, 2024

One can use pipx; it's far better than pip, which often influences whatever else you have installed on the device.

@justinh-rahb
Collaborator

justinh-rahb commented Feb 25, 2024

@dz0ny another one we attempted before, and might try again once they've worked out some bugs, is uv

@cocktailpeanut
Contributor

Hey guys, I created one: https://x.com/cocktailpeanut/status/1763254738177462672

Basically I work on a project called pinokio, which is sort of like a browser, but for automating anything on your computer; it can be used to install, run, and manage AI apps in native format (no need to mess with terminal stuff).

I became a fan of this project recently and have been using it daily, so I decided to write a one-click launcher script for it. Hope you enjoy.

@justinh-rahb
Collaborator

That's awesome @cocktailpeanut, glad to see you've kept busy. Thanks for the shoutout! 🫶

@J-eremy

J-eremy commented Mar 11, 2024

There are large issues with Windows and pretty much all Python packagers, especially single-file packages. Windows Defender flags everything that isn't signed (see: extortion). If Microsoft could get away with not allowing you to install any 3rd-party software they haven't approved, they definitely would. See S-Mode.

@tjbck unpinned this issue Mar 15, 2024
@nightboysfm

nightboysfm commented Mar 23, 2024

We should also have something like these from oobabooga/text-generation-webui, so that users can install without Docker with one command and streamline the installation process:

https://github.com/oobabooga/text-generation-webui/blob/main/start_linux.sh https://github.com/oobabooga/text-generation-webui/blob/main/start_macos.sh https://github.com/oobabooga/text-generation-webui/blob/main/start_windows.bat

Here it is, a one-click installation.
Create open-webui.bat, copy-paste the content below, put it in a folder (no spaces in the paths!), launch it, and let the magic happen. It will download all the dependencies into a subfolder and create a Python venv to keep things clean; Git is the only thing installed system-wide.
I think some paths are hard-coded in config files, so if you move the folder you might need to fix errors or just reinstall.
There are probably some improvements to make, but it works fine as is.

Content of the "open-webui.bat": https://pastebin.com/527wvn0k

If you want to enter the existing venv and make changes, you can create a "cmd_venv.bat" (or any name you like) and run it: https://pastebin.com/wNArfua2

I hope it will help.

@tjbck
Contributor Author

tjbck commented Mar 23, 2024

@nightboysfm Looks promising, feel free to make a PR!

@muhanstudio

This will be an excellent milestone. A desktop app, or even an Android app, could change the situation for LLM applications. Most of the excellent open-source models lack good clients, but maybe this project can bring hope and quickly bring open-source models to everyone with just a click of an EXE or APK file.

@motin

motin commented Jun 7, 2024

How about forking Ollama, reusing all the existing packaging already in place for macOS, Windows, and Linux, and bundling Open WebUI next to Ollama?

@justinh-rahb
Collaborator

How about forking Ollama, reusing all the existing packaging already in place for macOS, Windows, and Linux, and bundling Open WebUI next to Ollama?

Packaging isn't the hard part; doing it properly with certificate signing for the various platforms is the hairy part everyone wishes to avoid.

@motin

motin commented Jun 9, 2024

Packaging isn't the hard part; doing it properly with certificate signing for the various platforms is the hairy part everyone wishes to avoid.

I'd include certificate signing in the scope of packaging. Ollama has it figured out technically and it is obviously working (https://github.com/ollama/ollama/blob/main/.github/workflows/release.yaml), and the overlap between users installing the Ollama desktop app and users installing Open WebUI must be large. I agree that organizationally there are other challenges in setting up and maintaining the various developer programs/registrations (if that is what you mean by the hairy part, I agree), but one route is to officially ship with the Ollama desktop app, using Ollama's certificates.

@justinh-rahb
Collaborator

justinh-rahb commented Jun 10, 2024

I'd include certificate signing in the scope of packaging. Ollama has it figured out technically and it is obviously working (https://github.com/ollama/ollama/blob/main/.github/workflows/release.yaml), and the overlap between users installing the Ollama desktop app and users installing Open WebUI must be large. I agree that organizationally there are other challenges in setting up and maintaining the various developer programs/registrations (if that is what you mean by the hairy part, I agree), but one route is to officially ship with the Ollama desktop app, using Ollama's certificates.

While I'd love nothing more than the ultimate Ollama x WebUI collab of having them include us in their installer packages, I highly doubt they'd go for it, and I'd probably understand their reasons why.

Indeed you are correct: the main sticking point is managing the various developer accounts and credentials required by the platform owners, which are also not without cost.

@motin

motin commented Jun 10, 2024

How about a bring-your-own-credentials model, where it is made easy to package and sign executables / desktop apps, but by default they are not signed? (Ollama does this by only signing when the SIGN=1 env var is set while running the bundling scripts.) This way, anyone who needs signed artifacts (e.g. for distribution within their org) can produce them by following simple instructions.
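A minimal sketch of that opt-in signing gate, assuming a hypothetical `SIGNING_IDENTITY` variable and macOS's codesign tool purely as an example; the real build scripts would substitute each platform's signing command.

```python
# Sketch of bring-your-own-credentials signing: sign only when SIGN=1,
# mirroring Ollama's approach. SIGNING_IDENTITY and the codesign call
# are illustrative assumptions, not this project's actual tooling.
import os
import subprocess


def maybe_sign(artifact: str) -> bool:
    """Sign `artifact` if SIGN=1; return whether signing was attempted."""
    if os.environ.get("SIGN") != "1":
        print(f"SIGN!=1, shipping {artifact} unsigned")
        return False
    identity = os.environ["SIGNING_IDENTITY"]  # user-supplied credentials
    subprocess.check_call(["codesign", "--sign", identity, artifact])
    return True
```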

@snakeying

As I mentioned before, this is undoubtedly the best open-source UI I have ever experienced on GitHub. However, for most people new to Docker, the installation process can be a nightmare. Even with detailed instructions, they still find the CLI too technical. Over the past few days, I recommended open-webui to several friends, but they all complained about the lack of a straightforward installation method. They preferred AnythingLLM because it only requires downloading and clicking to install, without the need for WSL, Docker, or similar tools. This proves that a simple installation process is a crucial aspect of the user experience.

I suggest we offer installation prompts similar to AnythingLLM, such as those found here: https://docs.useanything.com/installation/desktop/windows. We could add the following to our installation guide:

Application is not signed!
➤ The Open-webui Windows application is currently unsigned and Windows Defender or other anti-virus software might flag the application as malicious.
➤ If you do not want to bypass that alert for any reason, please use open-webui in another way.

I understand this is not a perfect solution, but it at least provides more options for users and helps improve their overall experience. As @muhanstudio said, "quickly bring the open-source model to everyone who only clicks the EXE file or the APK file." Isn't user-friendliness one of the most important features of open-webui?

I sincerely hope this project continues to thrive and bring a better experience to more people. Once again, thank you to the development team and all the contributors for your hard work.

@RustoMCSpit

They preferred AnythingLLM because it only requires downloading and clicking to install, without the need for WSL, Docker, or similar tools

my friends use podman and haven't heard of docker (surprisingly), so the Docker-only instructions confused them

@zhouxihong1

I have packaged a Windows executable for open-webui; those who need it can download it from the link below. It also supports modifying the .env file and targets Windows x64.

https://github.com/zhouxihong1/open-webui/releases/download/V0.3.10/start_open_webui.7z

Projects
None yet
Development

No branches or pull requests