With this project you can use "Chatbot-UI" as the user interface for your Rivet projects! This allows you to build complex LLM-based processes (e.g. a teachable assistant) in a visual programming interface and interact with them through a beautiful chat UI. Chatbot-UI also keeps the entire conversation history, so you don't have to handle that yourself!
Features:
- Creates an OpenAI-SDK-compatible API for any Rivet project
- Captures the streaming output of a configured node and streams it back to the client
- Transforms messages before sending them to the Rivet graph
- Beautiful Chat-UI (provided by Chatbot-UI)
- Chatbot-UI features: multiple chats with conversation history, integrated RAG, etc.
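The setup below wires everything together, but as a sketch, this is the kind of OpenAI-style chat-completion request a client like Chatbot-UI ends up sending to the API. The port 3100 and the model name match the values configured later in this guide; the endpoint path is an assumption based on the OpenAI API convention, not taken from this project's code:

```javascript
// Sketch of the request body a client sends to the OpenAI-compatible API.
// Assumed endpoint, following the OpenAI convention:
const endpoint = "http://localhost:3100/v1/chat/completions";

const requestBody = {
  // Must match the Model ID you configure in Chatbot-UI later:
  model: "gpt-4-turbo-preview",
  // The API streams the output of the configured Rivet node:
  stream: true,
  // The message history Chatbot-UI keeps is sent along and gets
  // transformed into chat-message[] for the Rivet graph:
  messages: [
    { role: "user", content: "Hello, Rivet!" },
  ],
};

// A real client would POST it, e.g.:
// await fetch(endpoint, { method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(requestBody) });
console.log(JSON.stringify(requestBody));
```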
Currently not supported (maybe added in the future):
- System prompts and LLM settings (e.g. temperature) set in the Chatbot-UI interface are currently not sent to the graph
- Tools added in Chatbot-UI are not passed to the graph
For simplicity, everything is explained using Visual Studio Code. You can of course use another IDE.
- Install Visual Studio Code: https://code.visualstudio.com/download
- Install Node.js + npm: https://nodejs.org/en/download/ (during installation, make sure the npm package manager is also installed!)
- Install GitHub Desktop: https://desktop.github.com/
- Open a terminal or command line and enter

```
git clone https://github.com/ai-made-approachable/rivet-chat-api.git
```
- Open the folder in Visual Studio Code (File -> Open Folder)
- Open "Terminal -> New Terminal" and enter

```
npm install
```
- Go to /.vscode/ folder
- Rename "launch_sample.json" to "launch.json"
- Open "launch.json" and replace the value of OPEN_API_KEY with your OpenAI API key
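For orientation, the key lives in the `env` block of a launch configuration. The sketch below is a generic VS Code launch configuration, not the exact contents of "launch_sample.json" — only the OPEN_API_KEY entry is taken from this guide:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Launch rivet-chat-api",
      "env": {
        "OPEN_API_KEY": "<your OpenAI key>"
      }
    }
  ]
}
```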
Just press "Run -> Start Debugging" in Visual Studio Code.
- Make sure your project file has an input of type "chat-message" with "array" checked (type: chat-message[])
- Open

```
config/default.json
```
- Change the values according to your graph (file, graphName, graphInput, ...)
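As a sketch, the config might look roughly like this. The field names (file, graphName, graphInput) are the ones listed above; the values are placeholders, and the sample config may contain additional fields:

```json
{
  "file": "./example.rivet-project",
  "graphName": "example-graph",
  "graphInput": "input"
}
```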
We are using "Chatbot-UI" because it is a very user-friendly UI (similar to ChatGPT's UI): https://github.com/mckaywrigley/chatbot-ui
- Install Docker: https://docs.docker.com/engine/install/
Note: This installation takes a while, but it is a one-time thing!
- Open a terminal (macOS) or command line (Windows) and enter

```
git clone https://github.com/mckaywrigley/chatbot-ui.git
```
- Navigate into the repository folder

```
cd chatbot-ui
```
- Install dependencies

```
npm install
```
- Install Supabase

```
npm install supabase
```
- Make sure Docker is running
- Start Supabase

```
supabase start
```
- Create the file .env.local

```
cp .env.local.example .env.local
```
- Get the required values by running

```
supabase status
```
- Copy "API URL" value
- Open ".env.local" in Visual Studio Code (in chatbot-ui root folder)
- Insert copied value for "NEXT_PUBLIC_SUPABASE_URL" and save
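After saving, the relevant line in ".env.local" looks like this. The URL shown is a typical local value reported by `supabase status` as "API URL" — use whatever your own output shows:

```
NEXT_PUBLIC_SUPABASE_URL=http://127.0.0.1:54321
```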
- Open the file

```
supabase/migrations/20240108234540_setup.sql
```

- Replace "service_role_key" with the value from `supabase status` and save
Note: Also see instructions on: https://github.com/mckaywrigley/chatbot-ui
- Make sure Docker is running
- Navigate to your "chatbot-ui" folder
- Enter

```
supabase start
```
- Enter

```
npm run chat
```
- Navigate to the URL shown to you, usually: http://localhost:3000
- When you start "chatbot-ui" for the first time, enter an e-mail + password (don't worry, everything stays local on your PC)
- In the sidebar, press "Models" (stars icon) and then "New Model"
- Enter any name, but use `gpt-4-turbo-preview` as the Model ID and `http://localhost:3100/` as the Base URL. For the API Key you can enter anything
- Open the model selection in the top-right corner and select your custom model
- Have fun chatting!