diff --git a/docs/15_endpoint_apis/02_ollama_endpoint.ipynb b/docs/15_endpoint_apis/02_ollama_endpoint.ipynb
index 1e6c096..16ffea9 100644
--- a/docs/15_endpoint_apis/02_ollama_endpoint.ipynb
+++ b/docs/15_endpoint_apis/02_ollama_endpoint.ipynb
@@ -9,7 +9,7 @@
     "[Ollama](https://ollama.com/download) is a tool that downloads models to our computer and allows us to run them locally. Before executing the following code, you need to run `ollama run llama3:8b` once, if you didn't do this during setup. Also, depending on how you installed ollama, you may have to execute it in a terminal window using this command, before executing this notebook:\n",
     "\n",
     "```\n",
-    "ollam serve\n",
+    "ollama serve\n",
     "```\n",
     "\n",
     "As you will see, we access the local models offered via ollama using the OpenAI API as shown before. We just exchange the `base_url` and we do not need to provide an API-Key."
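
The patched markdown cell describes the pattern without showing the code cells in this hunk: point the OpenAI client at the local Ollama endpoint via `base_url` and pass a dummy key. Below is a minimal sketch of that setup, assuming the `openai` Python package, Ollama's default port 11434, and a placeholder prompt; the actual notebook cells may differ.

```python
# Sketch: talk to a local Ollama server through its OpenAI-compatible endpoint.
# Assumes `ollama serve` is running on the default port 11434 and `openai` is installed.
import openai

client = openai.OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama endpoint instead of api.openai.com
    api_key="none",  # Ollama ignores the key, but the client requires a non-empty string
)

response = client.chat.completions.create(
    model="llama3:8b",  # the model pulled earlier with `ollama run llama3:8b`
    messages=[{"role": "user", "content": "Hello, are you running locally?"}],  # placeholder prompt
)
print(response.choices[0].message.content)
```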