Not working out of the box! #117

Open

NSP-0123456 opened this issue Aug 27, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@NSP-0123456

The current GitHub repo does not seem to embed any Ollama engine, and no quick installation document is provided.

What is needed is a clear and concise description of the prerequisites, plus an Ollama installation and configuration document. A better approach could be to also embed the Ollama install script inside this repository for Docker.

Currently the instructions are useless, as nothing works out of the box.

Please elaborate on the Ollama part. I do not have any instance on my machine, and if I get the latest one from Docker Hub with docker pull ollama/ollama, it is still not working.
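For reference, a minimal sketch of running Ollama from the Docker Hub image (assuming the default port 11434 and a named volume for model storage; adjust to your setup):

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
docker exec -it ollama ollama pull mistral:v0.3
curl http://127.0.0.1:11434/api/tags   # should list the pulled model if the server is reachable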

NSP-0123456 added the enhancement (New feature or request) label Aug 27, 2024
@Arnaud3013

Arnaud3013 commented Sep 3, 2024

There are two big steps: first, run Ollama with a model (install Ollama and use open-webui to manage it); second, run the Docker instance of LLocalSearch.

1 -> install Ollama (ollama)
2 -> with Docker, install open-webui (open-webui) with this command in a shell (cmd):
git clone https://github.com/open-webui/open-webui
cd open-webui
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
Go to http://localhost:3000/
In the bottom left, open the admin panel.
Go to Settings -> in the Models section, enter mistral:v0.3, then click the download icon on the right.
You are now set with Ollama. If you want another model, go to https://ollama.com/ and search for it, then pick the tag you want (like llama3.1:8b, but that one didn't work well for me).
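If you'd rather skip open-webui for model management, the same pull can be done with the Ollama CLI directly (a sketch, assuming ollama is installed on the host):

ollama pull mistral:v0.3
ollama list   # confirms the model shows up locally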
Now LLocalSearch:
Go back to the main folder in your shell (cd ..).
git clone https://github.com/nilsherzig/LLocalSearch
Edit docker-compose.yaml to change the port on line 20: change '3000:80' to '3001:80' (see the sketch after these steps).
docker-compose up -d
Go to http://localhost:3001/chat/new
In the top right, you should be able to select your mistral model in "The agent chain is using the model ''".
Close that dialog; it should be working -> try asking something.
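For clarity, the port remap in LLocalSearch's docker-compose.yaml would look roughly like this (a sketch; the service name here is hypothetical, and the exact line may differ between versions):

services:
  frontend:            # hypothetical service name, check your file
    ports:
      - '3001:80'      # was '3000:80'; remapped so open-webui keeps port 3000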

@gardner

gardner commented Sep 4, 2024

You want to set: OLLAMA_HOST

Please review OLLAMA_GUIDE.md
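For example, pointing the LLocalSearch containers at an Ollama instance running on the Docker host could look like this in docker-compose.yaml (a sketch; the backend service name is hypothetical, and the exact value depends on your setup, see OLLAMA_GUIDE.md):

services:
  backend:             # hypothetical service name
    environment:
      - OLLAMA_HOST=http://host.docker.internal:11434   # assumes the Ollama server is reachable from the container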
