VicunaWithGUI

This project provides a web UI for Vicuna-13B (using llama-cpp-python and chatbot-ui).

1. Preview

1) Chat-UI (url: http://localhost:3000/en)

[Screenshot: chatbot-ui chat interface]

2) 🦙 llama.cpp Python API (url: http://localhost:8000/docs)

[Screenshot: FastAPI docs page for the llama.cpp Python API]

2. How to use it?

1) Clone this repo

git clone https://github.com/blackcon/VicunaWithGUI.git
cd VicunaWithGUI

2) Download the model (eachadea/ggml-vicuna-13b-4bit)

mkdir models
cd models
wget https://huggingface.co/eachadea/ggml-vicuna-13b-4bit/resolve/main/ggml-vicuna-13b-4bit-rev1.bin
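
The model file is several gigabytes, so it is worth checking that the download completed before moving on. A minimal sanity check, assuming the wget above finished inside the models directory:

# the file should exist and be several GB in size
ls -lh ggml-vicuna-13b-4bit-rev1.bin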

3) Run the API server (llama-cpp-python)

# add llama-cpp-python as a submodule and fetch its vendored llama.cpp
cd ..   # back to the repository root (step 2 left you in models/)
git submodule add https://github.com/abetlen/llama-cpp-python.git
git clone https://github.com/ggerganov/llama.cpp.git llama-cpp-python/vendor/llama.cpp

# pin the vendored llama.cpp to a known-good commit
cd llama-cpp-python/vendor/llama.cpp
git checkout 1d78fecdab4087028a38517e86ed129f077174d8

# set up llama-cpp-python
cd ../..   # back to llama-cpp-python/
pip3 install scikit-build cmake fastapi sse_starlette uvicorn pydantic-settings
python3 setup.py develop

# run the server (it stays in the foreground)
MODEL=`pwd`/../models/ggml-vicuna-13b-4bit-rev1.bin HOST=127.0.0.1 python3 -m llama_cpp.server
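
Once the server is up, you can smoke-test it before wiring up the UI. A quick sketch, assuming the defaults above (host 127.0.0.1, port 8000); the full endpoint list is at http://localhost:8000/docs:

# list the models the server is serving
curl http://127.0.0.1:8000/v1/models

# request a short chat completion from the OpenAI-compatible endpoint
curl http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}], "max_tokens": 32}'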

4) Run the chat UI (chatbot-ui)

# in a new terminal (the API server is still running), from the repository root
cd VicunaWithGUI
git submodule add https://github.com/mckaywrigley/chatbot-ui.git

cd chatbot-ui
# install
npm i

# build
npm run build

# run the dev server (the UI comes up at http://localhost:3000/en)
OPENAI_API_HOST=http://127.0.0.1:8000 OPENAI_API_KEY=blah npm run dev
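
chatbot-ui is a Next.js app, so instead of passing the variables inline you can usually keep them in an env file that the dev server picks up. A sketch, assuming the same variable names as above:

# chatbot-ui/.env.local
OPENAI_API_HOST=http://127.0.0.1:8000
OPENAI_API_KEY=blah

With both processes running, the chat UI from the preview should be reachable at http://localhost:3000/en.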
