A simple web interface with Markdown support built using Flask, designed to interact with Ollama models.
- 💬 LLM Chat: Chat with your Ollama instance.
- 📝 Markdown Support: Chat messages are rendered as Markdown.
- 🗂️ Multiple Conversations: Manage multiple conversations.
- 📜 Chat History: Chat history is saved locally.
- ⚙️ Custom Prompts: Create and save custom prompts for your models.
- ⏩ Message Streaming: Responses are streamed from the LLM as they are generated.
Use the package manager pip to install the dependencies:

```sh
pip install -r requirements.txt
```
Alternatively, with Nix:

- Follow the Nix installation instructions to set up `nix`.
- Enter the development shell from the `flake.nix`:

```sh
nix develop
```
To run the application, navigate to the root directory and execute:

> [!WARNING]
> This command runs the Flask development server in debug mode and is not meant for production use.

```sh
FLASK_APP=app/views FLASK_ENV=development flask --debug run
```
By default, a demo LLM is used. To use your own models with Ollama, edit the `app/app.py` file:

```python
# set the model name to your model or leave it empty for the demo
MODEL_NAME = ""  # "llama3.2:3b-instruct-q5_K_M"
```
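For reference, here is a minimal sketch of how a model name like the one above can be used to stream a reply from a local Ollama instance. The endpoint and payload follow Ollama's documented `/api/chat` API; the `stream_chat` helper and its wiring are illustrative, not the app's actual code.

```python
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def stream_chat(model_name, messages):
    """Yield partial assistant replies from Ollama as they arrive (illustrative helper)."""
    payload = {"model": model_name, "messages": messages, "stream": True}
    with requests.post(OLLAMA_URL, json=payload, stream=True) as resp:
        resp.raise_for_status()
        # Ollama streams newline-delimited JSON chunks until "done" is true
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            yield chunk["message"]["content"]

# Example: print a streamed answer piece by piece
for piece in stream_chat("llama3.2:3b-instruct-q5_K_M",
                         [{"role": "user", "content": "Hello!"}]):
    print(piece, end="", flush=True)
```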
You can use Markdown syntax in your messages, and Markdown in the model's answers is rendered as well. Parts of a message enclosed in `[` and `]` are hidden and can be made visible by hovering over or clicking on them.
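For example, a message like the following renders the emphasis and keeps the bracketed part hidden until you hover over or click it:

```
Here is a *hint* for the riddle: [the answer is 42].
```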
This project is licensed under the MIT License. Feel free to modify and use it as you wish!