
Not working on Fresh install - tested on 2 different systems. #220

Closed
braindotai opened this issue Jun 24, 2024 · 7 comments
Labels
bug Something isn't working

Comments

@braindotai

Describe the bug
Not working on a fresh install; I followed the steps from README.md.

To Reproduce
Steps to reproduce the behavior:

  1. git clone the repository.
  2. Change the Ollama URL to http://host.docker.internal:11434 as mentioned in the README (the exact config line is shown after these steps). It's my local Ollama; both Ollama and Perplexica run on the same laptop.
  3. docker compose up .... starts throwing errors.
  4. Entering a query on localhost:3000 does nothing.
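
For clarity, the edit in step 2 is just this one line in the config file (the OLLAMA field, written here exactly as I have it; the surrounding file may differ between versions):

  OLLAMA = "http://host.docker.internal:11434"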

Expected behavior
After docker compose up everything should work: when I enter a query it should reach my local Ollama server, prepare a response, and show the results.

Additional context
Here are the full logs:
perplexica-docker-compose-up.txt

braindotai added the bug label on Jun 24, 2024
@ItzCrazyKns
Owner

Seems like Perplexica isn't able to connect to Ollama. What OS are you using?

@braindotai
Author

Seems like Perplexica isn't able to connect to Ollama. What OS are you using?

I am using Arch Linux. http://localhost:11434 says "Ollama is running". I don't think the OS is the issue; I've used other open-source tools like the Continue extension in VS Code, and they work just fine with my local Ollama.

@ItzCrazyKns
Owner

ItzCrazyKns commented Jun 24, 2024 via email

@braindotai
Author

braindotai commented Jun 24, 2024

I've tried that. Here's what I did:

  1. $ ip addr show # copied the inet value from wlan0
  2. Changed the Ollama URL in the config to OLLAMA = "http://<my_machine_ip>:11434"
  3. $ sudo ufw allow 11434/tcp
  4. Stopped Ollama and restarted it with $ OLLAMA_HOST=0.0.0.0 ollama serve

Still not working:

perplexica-backend-1   | error: Error loading Ollama models: TypeError: fetch failed
perplexica-backend-1   | error: Error loading Ollama embeddings: TypeError: fetch failed
perplexica-backend-1   | error: undefined
perplexica-backend-1   | error: Error loading Ollama models: TypeError: fetch failed
perplexica-backend-1   | error: Error loading Ollama embeddings: TypeError: fetch failed
perplexica-backend-1   | error: undefined
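
Those errors come from inside the backend container, so one more check would be to fetch the URL from inside that container rather than from the host. A rough sketch, using the container name from the log prefix and Node's built-in fetch, since the backend is a Node app and curl may not be installed in the image:

  $ docker exec perplexica-backend-1 node -e "fetch('http://<my_machine_ip>:11434').then(r => r.text()).then(console.log)"

If that prints "Ollama is running", the container can reach Ollama; if it throws the same fetch failed error, it's a container networking problem rather than an Ollama problem.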

@ItzCrazyKns
Owner

Try sending a curl request to the same URL
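
Something like this from the host, against the exact URL you put in the config; it should answer with "Ollama is running":

  $ curl http://<my_machine_ip>:11434
  Ollama is running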

@braindotai
Author

(Screenshot from 2024-06-25 15-47-53 attached.)

All is good with the Ollama URL.

@braindotai
Author

Oh wait, I needed to go into the settings and select Ollama manually. Got it working now!
