
Issue with LLMS and Memgraph docker compose #555

Closed
antejavor opened this issue Mar 8, 2024 · 3 comments · Fixed by #703
Labels: priority: medium (missing info) · Additional information can be helpful or interesting, but its absence is not disruptive

Comments

@antejavor (Contributor)

Document the issue from the Discord thread: https://discord.com/channels/842007348272169002/890968976627228752/1215640019168006205

antejavor added the priority: medium (missing info) label on Mar 8, 2024
@kgolubic (Collaborator)

@antejavor I was unable to reproduce the error. I've tried using Ollama both as a desktop app and within a Docker container, and I've used both Docker and Docker Compose to run the Memgraph MAGE Docker image. Closing the issue; we can reopen it if we get additional data.

@katarinasupe (Contributor)

As the user mentioned, host.docker.internal is needed on Mac or Windows if you have Lab running as a separate container from Memgraph/Memgraph MAGE. This should be noted here as well, as we discussed on Slack (I'm not sure where you added it, but I would add it to the Docker Compose and GraphChat pages, as the user suggested).
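
A minimal Docker Compose sketch of the setup described above, with Lab and Memgraph MAGE as separate services. The service names, port mappings, and the QUICK_CONNECT_MG_HOST/QUICK_CONNECT_MG_PORT variables are assumptions based on the public Memgraph images, not details taken from the thread.

```yaml
# Hypothetical docker-compose.yml: Lab runs in a separate container from Memgraph MAGE.
services:
  memgraph:
    image: memgraph/memgraph-mage      # assumed image name
    ports:
      - "7687:7687"                    # Bolt port published to the host

  lab:
    image: memgraph/lab                # assumed image name
    ports:
      - "3000:3000"
    environment:
      # Inside this Compose network, the other service is reachable by its service name:
      - QUICK_CONNECT_MG_HOST=memgraph
      - QUICK_CONNECT_MG_PORT=7687
      # If Memgraph runs on the host instead (outside Compose), "localhost" inside
      # the Lab container points at the container itself, so on Mac or Windows the
      # host must be reached via host.docker.internal:
      # - QUICK_CONNECT_MG_HOST=host.docker.internal
```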

katarinasupe reopened this on Apr 10, 2024
@kgolubic (Collaborator)

It is mentioned at https://memgraph.com/docs/getting-started/install-memgraph/docker#issues-when-connecting-to-memgraph-lab-to-memgraph.

This looks like an issue with too many variables influencing the setup and results (OS, installation method of Memgraph and Ollama, etc.). I think the best approach for now would be to add callouts to the Docker Compose and GraphChat pages saying something along the lines of: "If you are having issues connecting, try using host.docker.internal instead of localhost or 127.0.0.1."

I'll prepare a PR for this.
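
To make the suggested callout concrete: if Ollama runs on the host and Lab runs in a container, an endpoint of http://localhost:11434 resolves inside the Lab container rather than on the host, which is the situation where host.docker.internal helps. Below is a hedged sketch of a Compose service for Lab that also covers Linux, where the alias is not provided automatically; the image name and Ollama's default port 11434 are assumptions, not details from this thread.

```yaml
# Hypothetical Compose service for Lab so GraphChat can reach an Ollama
# instance running directly on the host machine.
services:
  lab:
    image: memgraph/lab
    ports:
      - "3000:3000"
    extra_hosts:
      # Docker Desktop (Mac/Windows) provides host.docker.internal out of the
      # box; on Linux it has to be mapped to the host gateway explicitly.
      - "host.docker.internal:host-gateway"
```

With this in place, the GraphChat/Ollama endpoint would be set to http://host.docker.internal:11434 instead of http://localhost:11434 or http://127.0.0.1:11434.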
