
How to set up Ollama


Ollama Setup Instructions

Additional setup is required to use Ollama with Obsidian BMO Chatbot.

These instructions are specific to macOS.

For additional resources, check out 'How to Handle CORS Settings in OLLAMA: A Comprehensive Guide' by Durgaprasad Budhwani (DCoderAI on Medium, March 2024).

Install Ollama

  1. Go to the Ollama website.
  2. Click on 'Download' and select your operating system. (Ollama Download Page)
  3. After downloading and installing the application, quit the Ollama application so it is not running in the background (a scripted way to do this is sketched after this list).
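To confirm the install and stop the background server from the terminal, a minimal sketch follows. It assumes the default macOS app (named 'Ollama') with the CLI installed alongside it; quitting from the menu-bar icon works just as well.

```sh
# Confirm the Ollama CLI is installed and on your PATH
ollama --version

# Quit the menu-bar app so its built-in server stops running in the
# background (the scripted equivalent of quitting from the menu-bar icon)
osascript -e 'tell application "Ollama" to quit'
```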

Configure CORS in Ollama on macOS

There are several options:

Option 1

  1. Quit the Ollama application from running in the background.
  2. Open your terminal and run OLLAMA_ORIGINS=app://obsidian.md* ollama serve
  3. Go to 'BMO Chatbot Settings > Ollama Connection' and insert Ollama's default URL, http://localhost:11434, into the 'OLLAMA REST API URL' field.
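To check that Option 1 is working before touching the plugin settings, here is a minimal sketch; it uses Ollama's standard /api/tags endpoint (which lists your local models) purely as a reachability test.

```sh
# Terminal 1: start Ollama, allowing requests from Obsidian's app origin
OLLAMA_ORIGINS=app://obsidian.md* ollama serve

# Terminal 2: confirm the server answers on the default port
curl http://localhost:11434/api/tags
```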

Option 2

Another option is to run the server on a different port.

  1. Open the terminal and run OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve or OLLAMA_ORIGINS=app://obsidian.md* OLLAMA_HOST=127.0.0.1:11435 ollama serve. This runs a separate server instance on port 11435 whose CORS settings allow requests from Obsidian.
  2. Go to 'BMO Chatbot Settings > Ollama Connection' and insert the alternate URL http://localhost:11435 into the 'OLLAMA REST API URL' field.
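A minimal sketch of Option 2, again using /api/tags as a reachability check. Note that OLLAMA_HOST also tells the ollama CLI client which server to talk to, so CLI commands aimed at this instance need it set as well.

```sh
# Terminal 1: run a server instance on port 11435 with permissive CORS
OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve

# Terminal 2: confirm the alternate port responds
curl http://127.0.0.1:11435/api/tags

# CLI commands against this instance also need OLLAMA_HOST set
OLLAMA_HOST=127.0.0.1:11435 ollama list
```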

Option 3

  1. Open your terminal and run launchctl setenv OLLAMA_ORIGINS "app://obsidian.md*" (to unset, run launchctl unsetenv OLLAMA_ORIGINS).
  2. Restart the Ollama application.
  3. Go to 'BMO Chatbot Settings > Ollama Connection' and insert Ollama's default URL, http://localhost:11434, into the 'OLLAMA REST API URL' field.
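For Option 3, a short sketch showing how to set, confirm, and later remove the launchd environment variable (launchctl getenv prints the current value):

```sh
# Make the origin allowlist visible to apps started by launchd
launchctl setenv OLLAMA_ORIGINS "app://obsidian.md*"

# Confirm the value is set
launchctl getenv OLLAMA_ORIGINS

# Later, to undo the change
launchctl unsetenv OLLAMA_ORIGINS
```

Note that values set with launchctl setenv do not survive a reboot, so you will need to re-run the command after restarting your Mac.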

Install models

  1. Go to https://ollama.com/library and pick a model.
  2. Install a model (e.g. ollama run llama3); a terminal-only alternative is sketched after this list.
  3. Reload the model list in BMO Chatbot Settings (located under 'General > Model').
  4. In the 'Model' dropdown, select any model that you downloaded with Ollama.