
Error on question #102

Open
Nefnief-tech opened this issue Apr 28, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@Nefnief-tech

When choosing the model llama3 I get this error when asking a question:

han once.\n5. Answer in the same language as the question.}"
backend-1 | 2024/04/28 15:15:47 INFO Entering chain tokens=10
backend-1 | 2024/04/28 15:15:47 INFO Entering chain tokens=83
backend-1 | Exiting chain with error: Post "host.docker.internal:11434": unsupported protocol scheme "host.docker.internal"
backend-1 | Exiting chain with error: Post "host.docker.internal:11434": unsupported protocol scheme "host.docker.internal"

Nefnief-tech added the bug label Apr 28, 2024

iChristGit commented Apr 29, 2024

I also get this error; my Ollama install is on the main PC (no Docker).

There was another issue about this:

If you get unsupported protocol scheme "host.docker.internal", change

LLocalSearch/docker-compose.yaml

Line 6 in ed098b2

OLLAMA_HOST=${OLLAMA_HOST:-host.docker.internal:11434}

to

OLLAMA_HOST=${OLLAMA_HOST:-http://host.docker.internal:11434}
