# ExUI

This is a simple, lightweight browser-based UI for running local inference using ExLlamaV2.

## Overview of features

- Friendly, responsive and minimalistic UI
- Persistent sessions
- Multiple instruct formats
- Speculative decoding
- Supports EXL2, GPTQ and FP16 models
- Notepad mode

## Screenshots

*(chat screenshots)*

## Running locally

First, clone this repository and install its requirements:

```shell
git clone https://github.com/turboderp/exui
cd exui
pip install -r requirements.txt
```

Then run the web server with the included `server.py`:

```shell
python server.py
```

Your browser should open automatically at the default IP/port. Config and sessions are stored in `~/exui` by default.
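To see what ExUI has saved so far, you can list the contents of that directory. A minimal sketch, assuming plain files live directly in the default `~/exui` directory (the exact file layout is not documented here):

```python
from pathlib import Path

# ExUI stores config and sessions in ~/exui by default (per the README).
# Assumption: the directory contains plain files; the exact layout is undocumented.
def list_exui_files(base: Path = Path.home() / "exui") -> list[str]:
    """Return the names of files in the ExUI data directory, or [] if absent."""
    if not base.is_dir():
        return []
    return sorted(p.name for p in base.iterdir() if p.is_file())

print(list_exui_files())
```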

Prebuilt wheels for ExLlamaV2 are available here. Installing the latest version of Flash Attention is recommended.
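Before launching the server, it can help to confirm that these dependencies are importable from your environment. A minimal sketch; the module names `exllamav2` and `flash_attn` are assumptions based on the usual import names of the ExLlamaV2 and Flash Attention packages:

```python
import importlib.util

# Hypothetical pre-flight check: the module names below are assumptions based
# on the usual import names for the exllamav2 and flash-attn packages.
def check_modules(names=("exllamav2", "flash_attn")) -> dict[str, bool]:
    """Map each module name to whether the import system can locate it."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

for name, found in check_modules().items():
    print(f"{name}: {'found' if found else 'missing'}")
```

`find_spec` only probes the import machinery, so the check is cheap and does not actually import (or initialize) either package.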

## More to come

Stay tuned.
