From 492f42b572248063a529ce3cff54a8bde55d96e4 Mon Sep 17 00:00:00 2001
From: Omkar Bhoir <71968255+omkar787@users.noreply.github.com>
Date: Thu, 25 Apr 2024 22:30:00 +0530
Subject: [PATCH] README: note that the Ollama service must be running and
 document a fix for the Jupyter "connection refused" error

---
 README.md | 22 ++++++++++++++++++++++
 1 file changed, 22 insertions(+)

diff --git a/README.md b/README.md
index 2864917..ed71866 100644
--- a/README.md
+++ b/README.md
@@ -21,6 +21,28 @@ response = ollama.chat(model='llama2', messages=[
 print(response['message']['content'])
 ```
 
+### Note
+
+To use this library, the Ollama service must already be running on your system. If you are running Ollama from a Jupyter notebook and hit a "connection refused" error, you can start the server from within the notebook:
+
+```python
+import os
+import subprocess
+import threading
+
+def start_ollama():
+    # Bind to all interfaces and allow cross-origin requests, which
+    # notebook environments often require.
+    os.environ['OLLAMA_HOST'] = '0.0.0.0:11434'
+    os.environ['OLLAMA_ORIGINS'] = '*'
+    # Popen returns immediately; the server keeps running in the background.
+    subprocess.Popen(["ollama", "serve"])
+
+# Named start_ollama (not ollama) so it does not shadow the imported module.
+ollama_thread = threading.Thread(target=start_ollama)
+ollama_thread.start()
+```
+
 ## Streaming responses
 
 Response streaming can be enabled by setting `stream=True`, modifying function calls to return a Python generator where each part is an object in the stream.
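
The "Streaming responses" paragraph in the hunk's trailing context says that `stream=True` makes the call return a generator of parts; a minimal sketch of what such a call could look like, assuming the same `ollama.chat` signature and `llama2` model used earlier in the README:

```python
import ollama

# With stream=True, chat() is assumed to return a generator that
# yields partial responses instead of a single completed reply.
stream = ollama.chat(
    model='llama2',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)

# Print each chunk as it arrives rather than waiting for the full answer.
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
```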