
Ollama: Forbidden (403) while fetching models on localhost #276

Closed
sudhamjayanthi opened this issue Dec 14, 2023 · 3 comments

@sudhamjayanthi
Describe the bug
big-AGI fails to pull models from the Ollama server on the web version when clicking the fetch models button. The following error appears:

[Issue] Ollama: Forbidden (403) - is http://localhost:11434/api/tags accessible by the server?

Where is it happening?
On the hosted web client at https://get.big-agi.com.

To Reproduce
Steps to reproduce the behavior:

  1. Run Ollama Server locally with OLLAMA_ORIGINS=https://get.big-agi.com/ ollama serve

    [screenshot: terminal running ollama serve with OLLAMA_ORIGINS set]

  2. Go to the web client.

  3. Go to Models > Add > Ollama and click the fetch models button.

    [screenshot: the Forbidden (403) error in the Models dialog]

Expected behavior
It should pull the models from the server without any issues. (The screenshot shows the JSON returned when I hit the same API endpoint in my browser.)

[screenshot: JSON response from http://localhost:11434/api/tags]
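For reference, a minimal terminal equivalent of that browser check, assuming the default Ollama port (the JSON contents are those in the screenshot, not reproduced here):

```sh
# Query the Ollama tags endpoint directly on the machine running `ollama serve`:
curl http://localhost:11434/api/tags
# => {"models":[ ... ]}   (the model list shown in the screenshot)
```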

Screenshots / context
Attached in relevant places above.

@enricoros (Owner) commented Dec 15, 2023

Hi @sudhamjayanthi, thanks for reporting - this is expected behavior that's good for us to document.

The issue is the following:

  • the server running on get.big-agi.com (a Vercel server in the cloud) is trying to connect to "localhost:11434"
  • that cloud machine does not have Ollama running on it
  • the user wants to connect to Ollama on their own local machine, but that machine is inaccessible from the Vercel cloud

A quick schematic:

  • Big-AGI Frontend, running on your Browser ✅ ---> big-AGI backend running on Vercel ✅ ---> "Localhost":11434 ❌ network inaccessible (as the error message also says)
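To make the failure concrete - a hedged illustration of the backend's request as seen from the Vercel machine (in this deployment the error surfaced to the user as the 403 above):

```sh
# On the Vercel server, "localhost" resolves to the server itself, not to the
# user's machine, and no Ollama instance is listening there:
curl http://localhost:11434/api/tags
# fails - nothing on the cloud machine's port 11434 can answer the request
```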

A few ways to fix this issue:

  • run big-AGI locally in production mode with npm i && npm run build && npm run start (see README.md for details)
    • this ensures that both big-AGI and Ollama run on localhost and can talk to each other (see the sketch after this list)
  • run big-AGI locally with Docker (and see our Docker Deployment documentation to make sure it's on the host network)
  • expose your localhost:11434 Ollama service to the internet (or otherwise make it reachable from the big-AGI server, and use a full IP address to reach it)
    • this depends on your OS, networking skills, etc.
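A minimal sketch of the first two options; the repository URL, image name, and port 3000 are assumptions based on project defaults, not taken from this thread:

```sh
# Option 1: run big-AGI locally in production mode (repo URL assumed):
git clone https://github.com/enricoros/big-AGI.git
cd big-AGI
npm i && npm run build && npm run start
# big-AGI now serves on http://localhost:3000 (Next.js default), and its
# backend can reach Ollama at http://localhost:11434 on the same machine.

# Option 2: run big-AGI with Docker on the host network (Linux; image name
# illustrative), so the container shares localhost with Ollama:
docker run -d --network=host ghcr.io/enricoros/big-agi
```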

Let me know if this helps to fix this networking configuration issue.

Summary:
[screenshot: summary diagram of the network paths above]

Edit: I've updated the Ollama deployment docs (docs/config-ollama.md) to reflect this.

@enricoros (Owner) commented
Closing as an extensive doc is now provided in the app docs.

@enricoros enricoros added this to the 1.8.0 milestone Dec 20, 2023
@Shadow-Wizard-Money-Gang

I had the network error described in #276 previously and tried the solutions above; I'm now running big-AGI in a Docker container and Ollama locally. How do I connect them, though? I'm now getting a new error: [Issue] Ollama: (network) fetch failed - Error: connect ECONNREFUSED 127.0.0.1:11434
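For context on that ECONNREFUSED: inside a Docker container, 127.0.0.1 is the container itself, not the host where Ollama listens. A minimal sketch of the usual workarounds (image name and ports are illustrative, not from this thread):

```sh
# Docker Desktop (macOS/Windows): the host is reachable as host.docker.internal;
# set big-AGI's Ollama host to http://host.docker.internal:11434.
docker run -d -p 3000:3000 ghcr.io/enricoros/big-agi

# Linux: either share the host's network stack (127.0.0.1 then works directly)...
docker run -d --network=host ghcr.io/enricoros/big-agi
# ...or map host.docker.internal to the host gateway explicitly:
docker run -d -p 3000:3000 --add-host=host.docker.internal:host-gateway ghcr.io/enricoros/big-agi

# Note: Ollama binds to 127.0.0.1 by default; for the host.docker.internal
# routes it may need to listen on all interfaces:
OLLAMA_HOST=0.0.0.0 ollama serve
```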
