
Ollama GUI logo

Ollama GUI: Web Interface for chatting with your local LLMs.

Ollama GUI is a web interface for ollama.ai, a tool that enables running Large Language Models (LLMs) on your local machine.

🛠 Installation

Prerequisites

  1. Download and install the ollama CLI.
  2. Download and install yarn and Node.js.
  3. Pull a model and start the ollama server:

ollama pull <model-name>
ollama serve

Getting Started

  1. Clone the repository and start the development server.
git clone https://github.com/HelgeSverre/ollama-gui.git
cd ollama-gui
yarn install
yarn dev

Or use the hosted web version by starting ollama with its origin allowed (docs):

OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve
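Whether self-hosted or via the allowed origin above, the GUI talks to the local ollama REST API, which streams a reply as newline-delimited JSON chunks. A minimal sketch of reassembling such a streamed reply (TypeScript; field names follow the ollama `/api/generate` response format, and `assembleResponse` is an illustrative helper, not part of this project):

```typescript
// Each line of a streamed ollama reply is a JSON object such as
// {"model":"mistral","response":"Hel","done":false}
interface GenerateChunk {
  response: string;
  done: boolean;
}

// Concatenate the "response" fragments from a raw NDJSON body.
function assembleResponse(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as GenerateChunk)
    .map((chunk) => chunk.response)
    .join("");
}
```

In practice the GUI would read these chunks incrementally from the fetch stream to render tokens as they arrive; the helper above only shows the parsing step.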

Models

For convenience and copy-pastability, here is a table of interesting models you might want to try out.

For a complete list of models Ollama supports, go to ollama.ai/library.

| Model                | Parameters | Size  | Download                        |
|----------------------|------------|-------|---------------------------------|
| Mixtral-8x7B Large   | 7B         | 26GB  | ollama pull mixtral             |
| Phi                  | 2.7B       | 1.6GB | ollama pull phi                 |
| Solar                | 10.7B      | 6.1GB | ollama pull solar               |
| Dolphin Mixtral      | 7B         | 4.1GB | ollama pull dolphin-mixtral     |
| Mistral              | 7B         | 4.1GB | ollama pull mistral             |
| Mistral (instruct)   | 7B         | 4.1GB | ollama pull mistral:7b-instruct |
| Llama 2              | 7B         | 3.8GB | ollama pull llama2              |
| Code Llama           | 7B         | 3.8GB | ollama pull codellama           |
| Llama 2 Uncensored   | 7B         | 3.8GB | ollama pull llama2-uncensored   |
| Orca Mini            | 3B         | 1.9GB | ollama pull orca-mini           |
| Falcon               | 7B         | 3.8GB | ollama pull falcon              |
| Vicuna               | 7B         | 3.8GB | ollama pull vicuna              |
| Vicuna (16K context) | 7B         | 3.8GB | ollama pull vicuna:7b-16k       |
| Vicuna (16K context) | 13B        | 7.4GB | ollama pull vicuna:13b-16k      |
| NexusRaven           | 13B        | 7.4GB | ollama pull nexusraven          |
| StarCoder            | 7B         | 4.3GB | ollama pull starcoder:7b        |
| WizardLM Uncensored  | 13B        | 7.4GB | ollama pull wizardlm-uncensored |

📋 To-Do List

  • Properly format newlines in chat messages (PHP has nl2br; the same behavior is wanted here)
  • Store chat history locally using IndexedDB
  • Clean up the code (it was rushed for the sake of getting something out the door)
  • Add a markdown parsing library
  • Allow browsing and installing available models (library)
  • Ensure mobile responsiveness (a non-prioritized use case for now)
  • Add file uploads with OCR
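For the first item, PHP's nl2br inserts a <br> tag before each newline while keeping the newline itself. A hypothetical TypeScript equivalent (this helper is a sketch, not code from this repository):

```typescript
// Insert <br> before each line break (\r\n, \r, or \n), preserving the
// original break character(s), so chat messages keep their line breaks
// when rendered as HTML -- mirroring PHP's nl2br.
function nl2br(text: string): string {
  return text.replace(/(\r\n|\r|\n)/g, "<br>$1");
}
```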

🛠 Built With


📝 License

Licensed under the MIT License. See the LICENSE.md file for details.
