From 5fcf0baaa4b6f9406339ff1221950ce8de0c39f1 Mon Sep 17 00:00:00 2001
From: Martin Czygan
Date: Sun, 9 Nov 2025 15:26:38 +0100
Subject: [PATCH] docs 15/02: fix typo

---
 docs/15_endpoint_apis/02_ollama_endpoint.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/15_endpoint_apis/02_ollama_endpoint.ipynb b/docs/15_endpoint_apis/02_ollama_endpoint.ipynb
index 1e6c0966..16ffea9b 100644
--- a/docs/15_endpoint_apis/02_ollama_endpoint.ipynb
+++ b/docs/15_endpoint_apis/02_ollama_endpoint.ipynb
@@ -9,7 +9,7 @@
     "[Ollama](https://ollama.com/download) is a tool that downloads models to our computer and allows us to run them locally. Before executing the following code, you need to run `ollama run llama3:8b` once, if you didn't do this during setup. Also, depending on how you installed ollama, you may have to execute it in a terminal window using this command, before executing this notebook:\n",
     "\n",
     "```\n",
-    "ollam serve\n",
+    "ollama serve\n",
     "```\n",
     "\n",
     "As you will see, we access the local models offered via ollama using the OpenAI API as shown before. We just exchange the `base_url` and we do not need to provide an API-Key."
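The patched notebook text states that the local models are reached through the OpenAI API by swapping the `base_url` and omitting a real API key. A minimal sketch of what that swap amounts to, using only the standard library (assumptions: Ollama's default address `localhost:11434`, its OpenAI-compatible `/v1` path, and the `llama3:8b` model pulled as described above):

```python
# Sketch of the base_url swap described in the notebook.
# Assumptions: `ollama serve` is running on the default port 11434,
# and `llama3:8b` has been pulled via `ollama run llama3:8b`.
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # instead of https://api.openai.com/v1
API_KEY = "ollama"  # any non-empty placeholder; Ollama does not validate it


def build_chat_request(prompt, model="llama3:8b"):
    """Build the same POST request an OpenAI-style client would send."""
    body = json.dumps(
        {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }
    ).encode("utf-8")
    return urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )


if __name__ == "__main__":
    # Only attempt the network call when run directly, since it
    # requires a local Ollama server to be listening.
    req = build_chat_request("Say hello in one word.")
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

In the notebook itself the same effect is achieved by passing `base_url` to the OpenAI client constructor; the sketch just makes explicit that nothing beyond the URL and a dummy key changes.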