This repo is the backend part for the Obsidian plugin found here: https://github.com/brumik/obsidian-ollama-chat
- Ollama running on `localhost:11434` (read further: https://ollama.ai/)
- A model installed (such as `mistral`)
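Once a model is pulled (e.g. `ollama pull mistral`), you can sanity-check the Ollama server by sending it a request. A minimal sketch of the request body the Ollama HTTP API expects for a non-streaming `POST /api/generate` call; the model name here is just an example:

```python
import json

def build_generate_request(prompt, model="mistral"):
    """Return the JSON body for a non-streaming Ollama /api/generate call."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

# Send this body to http://localhost:11434/api/generate with any HTTP client
body = build_generate_request("Say hello.")
```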
```sh
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp .env.sample .env
```
- Fill out the `.env` file to your liking
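For illustration only, a filled-out `.env` might look like the sketch below. The variable names here are hypothetical; the actual keys are the ones listed in `.env.sample`:

```sh
# Hypothetical keys -- check .env.sample for the real ones
OLLAMA_URL=http://localhost:11434
MODEL_NAME=mistral
```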
```sh
python index.py
```
I am looking into ways to create a Dockerfile and compose file that set up the Python app for you, but I am running into problems with networking. If you are a Docker virtuoso, I am happy to accept your help.
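One possible starting point for the networking issue: from inside a container, `localhost` refers to the container itself, so the app has to reach Ollama on the host via `host.docker.internal`. An untested sketch, assuming the app's port and that it reads the Ollama address from a (hypothetical) `OLLAMA_URL` variable:

```yaml
# docker-compose.yml -- a sketch, not a verified setup
services:
  backend:
    build: .
    ports:
      - "8000:8000"          # hypothetical app port
    environment:
      # hypothetical variable; points the app at Ollama running on the host
      - OLLAMA_URL=http://host.docker.internal:11434
    extra_hosts:
      # makes host.docker.internal resolve on Linux as well
      - "host.docker.internal:host-gateway"
```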
I am also planning to look into deploying the app as an executable, but Python is not my main language, so any recommendations are welcome.
Feel free to open an issue if you run into a problem or would like to see a feature.
Like every programmer, I convert coffee to code: