Fix docker compose up and MacBook segfault #1428

Open · wants to merge 5 commits into base: main
25 changes: 16 additions & 9 deletions Dockerfile.local
@@ -1,13 +1,15 @@
### IMPORTANT: THIS IMAGE CAN ONLY BE RUN ON LINUX DOCKER
### You will run into a segfault on macOS
FROM python:3.11.6-slim-bookworm as base

# Install poetry
RUN pip install pipx
RUN python3 -m pipx ensurepath
RUN pipx install poetry
ENV PATH="/root/.local/bin:$PATH"
# Set the environment variable for the file URL (can be overwritten)
ENV FILE_URL="https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/resolve/main/mistral-7b-instruct-v0.1.Q4_K_M.gguf"
# Set the predefined model name (can be overwritten)
ENV NAME="mistral-7b-instruct-v0.1.Q4_K_M.gguf"
ENV PATH=".venv/bin/:$PATH"
ENV PYTHONPATH="$PYTHONPATH:/private_gpt/"

# Dependencies to build llama-cpp
RUN apt update && apt install -y \
@@ -24,8 +26,10 @@ FROM base as dependencies
WORKDIR /home/worker/app
COPY pyproject.toml poetry.lock ./

RUN poetry config installer.max-workers 10
RUN poetry install --with local
RUN poetry install --with ui
RUN poetry install --extras chroma

FROM base as app

@@ -34,18 +38,21 @@ ENV PORT=8080
EXPOSE 8080

# Prepare a non-root user
RUN adduser --system worker
RUN adduser worker
WORKDIR /home/worker/app

RUN mkdir local_data; chown worker local_data
RUN mkdir models; chown worker models
RUN mkdir -p local_data; chown -R worker local_data
RUN mkdir -p models; chown -R worker models

COPY --chown=worker --from=dependencies /home/worker/app/.venv/ .venv
COPY --chown=worker private_gpt/ private_gpt
COPY --chown=worker fern/ fern
COPY --chown=worker *.yaml *.md ./
COPY --chown=worker scripts/ scripts

ENV PYTHONPATH="$PYTHONPATH:/private_gpt/"
# Copy the entry point script into the container and make it executable
COPY --chown=worker entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh

USER worker
ENTRYPOINT python -m private_gpt
# Set the entry point script to be executed when the container starts
ENTRYPOINT ["/entrypoint.sh", ".venv/bin/python", "-m", "private_gpt"]
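The switch from the shell-form `ENTRYPOINT python -m private_gpt` to the exec form matters here: in exec form, Docker invokes `/entrypoint.sh` directly and passes the remaining array entries as its positional parameters, which the script later hands off via `exec "$@"`. A minimal sketch of that hand-off in plain sh (no Docker assumed; `set --` stands in for the arguments Docker would pass):

```shell
#!/bin/sh
# Simulate the exec-form ENTRYPOINT: Docker effectively runs
#   /entrypoint.sh .venv/bin/python -m private_gpt
# so the script receives the main command as its positional parameters.
set -- ".venv/bin/python" "-m" "private_gpt"

# entrypoint.sh would finish with: exec "$@"
# Here we only print what would be exec'd.
echo "would exec: $*"
```

With the shell form, Docker would instead wrap the command in `/bin/sh -c`, and the entrypoint script would never see these arguments.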
Empty file added docs/.gitignore
Empty file.
13 changes: 13 additions & 0 deletions entrypoint.sh
@@ -0,0 +1,13 @@
#!/bin/sh

# Check if the FILE_URL environment variable is set
if [ -z "$FILE_URL" ]
then
    echo "Error: FILE_URL environment variable is not set."
    exit 1
fi

wget -O "models/${NAME}" "${FILE_URL}"

# Execute the main container command
exec "$@"
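The script follows a guard-then-exec pattern: validate required environment, fetch the model, then replace the shell with the real process so signals reach it directly. The guard can be exercised outside the container; a small sketch (the helper name `require_var` is illustrative, not part of the PR):

```shell
#!/bin/sh
# Mirror the entrypoint's guard: fail fast when a required variable is empty.
require_var() {
    if [ -z "$2" ]; then
        echo "Error: $1 environment variable is not set." >&2
        return 1
    fi
}

FILE_URL="https://example.com/model.gguf"   # stand-in value for illustration
require_var FILE_URL "$FILE_URL" && echo "guard passed"
```

Using `exec "$@"` (rather than plain `"$@"`) also makes the Python process PID 1's direct replacement, so `docker stop` delivers SIGTERM to it instead of to a lingering shell.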
3 changes: 2 additions & 1 deletion settings.yaml
@@ -45,7 +45,7 @@ embedding:
ingest_mode: simple

vectorstore:
database: qdrant
database: chroma

qdrant:
path: local_data/private_gpt/qdrant
@@ -63,3 +63,4 @@ sagemaker:
openai:
api_key: ${OPENAI_API_KEY:}
model: gpt-3.5-turbo
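With `vectorstore.database` switched to `chroma`, the `qdrant:` block in settings.yaml becomes unused but harmless. If you want to restore the previous behavior without editing the default file, a settings override could look like the sketch below (whether the project loads profile-specific settings files such as `settings-local.yaml` is an assumption; check the project's configuration docs):

```yaml
# hypothetical override file; verify the profile mechanism before relying on it
vectorstore:
  database: qdrant

qdrant:
  path: local_data/private_gpt/qdrant
```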