
Fixed Docker version with Ollama #1812

Merged: 1 commit into zylon-ai:main on Apr 1, 2024
Conversation

mrepetto-certx (Contributor):
Given that Ollama is now the recommended setup, I propose modifying the Dockerfile and compose file to make them coherent with the new local setup. I decoupled Ollama from privateGPT, making it a separate microservice.
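For context, a minimal docker-compose sketch of what "Ollama as a separate microservice" could look like. The service names, ports, and PGPT_* variables here are illustrative assumptions, not the exact contents of this PR's compose file:

```yaml
# Minimal sketch of the decoupled layout; not the exact compose file from this PR.
# Service names, ports, and PGPT_* variables are illustrative assumptions.
services:
  private-gpt:
    build: .
    ports:
      - "8001:8001"                 # privateGPT UI/API
    environment:
      PGPT_PROFILES: docker         # assumed profile name
      PGPT_OLLAMA_API_BASE: http://ollama:11434  # assumed variable pointing at the Ollama service
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest     # Ollama runs as its own service
    ports:
      - "11434:11434"               # Ollama's default API port
    volumes:
      - ollama-models:/root/.ollama # persist pulled models across restarts

volumes:
  ollama-models:
```

The point of the split is that the application image no longer needs to bundle or manage the model runtime; privateGPT just talks to the Ollama service over HTTP.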

@imartinez (Collaborator) left a comment:


Makes total sense, thanks!

@imartinez merged commit f83abff into zylon-ai:main on Apr 1, 2024
6 checks passed
@qdm12 commented on Apr 2, 2024:

> Makes total sense, thanks!

Ehh, we should really stop adding things to that poetry extras command. I don't use Ollama, and now the default Docker image ships with Ollama extras I have no use for. In my opinion we should either bundle ALL extras or none at all (see below for how this would work).
I opened PR #1792 to add a POETRY_EXTRAS build argument, so you can build the Docker image with --build-arg POETRY_EXTRAS="..." as you wish. If we're OK with breaking compatibility, I think it would be wise to default it to an empty string (POETRY_EXTRAS="") and document which extras are available for docker build commands.
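A rough sketch of how such a build argument could look; this is illustrative only (the actual diff lives in PR #1792), and the base image, file layout, and install flags are assumptions rather than the project's real Dockerfile:

```dockerfile
# Sketch of the POETRY_EXTRAS idea only; see PR #1792 for the real change.
FROM python:3.11-slim

RUN pip install poetry

WORKDIR /app
COPY pyproject.toml poetry.lock ./

# Empty default keeps the base image free of optional extras.
ARG POETRY_EXTRAS=""

# Install the project, activating only the extras requested at build time.
RUN if [ -n "$POETRY_EXTRAS" ]; then \
      poetry install --no-root --extras "$POETRY_EXTRAS"; \
    else \
      poetry install --no-root; \
    fi

COPY . .
```

Then, assuming extras names such as "ui", "llms-ollama", and "embeddings-ollama" exist in the project's pyproject.toml, an Ollama-flavoured image could be built with something like `docker build --build-arg POETRY_EXTRAS="ui llms-ollama embeddings-ollama" .`, while the default build stays extras-free.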
