
🚀 Dockerize dalai #39

Open
wants to merge 4 commits into main
Conversation

bernatvadell

I've been playing around with this repository and dockerized the app.

During the process I refactored a few things, always respecting the original defaults to avoid breaking changes.

If you are interested in maintaining support for this feature, I would recommend publishing the image on dockerhub.

I've also included a docker-compose file to ease initial startup, plus a run:docker script in package.json.

The only requirement is to have Docker Engine (or Docker Desktop) installed; then execute yarn run:docker or npm run run:docker.

If the image were published on Docker Hub, running it would be as simple as:

Basic:
docker run -p 3000:3000 cocktailpeanut/dalai

Specifying a model:
docker run -p 3000:3000 -e LLAMA_MODEL=7B cocktailpeanut/dalai

Mapping a volume to persist models:
docker run -p 3000:3000 -v models:/home/dalai/models cocktailpeanut/dalai

Rename home to llamaPath
Add usePyEnv to specify whether to use a virtual environment
Add config to set a custom model folder, keeping the llama.cpp git directory clean
Check the Python path before using it
Other minor changes
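For reference, a compose file of the shape this PR describes might look like the following sketch; the service name and exact paths are assumptions inferred from the docker run commands above, not the literal file from the PR:

```yaml
services:
  dalai:
    image: cocktailpeanut/dalai
    environment:
      - LLAMA_MODEL=7B              # model to download/convert on startup
    volumes:
      - ./models:/home/dalai/models # persist converted models on the host
    ports:
      - "3000:3000"                 # web UI
```

With a file like this in place, initial startup reduces to docker compose up.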
@waffletower

This is awesome! Will have to try this out!

@adampankow

adampankow commented Mar 15, 2023

Good work on this. It worked great, minus some small issues. The models directory still ended up owned by root, hence not writable. Also, the container needed to be rerun a few times before the model converted properly. Otherwise great though! 👍 👍

@kaminskypavel

great work @bernatvadell!

I'm using Docker Desktop on Windows, and the final step crashed for me. It seems ./dalai has Windows-style line endings (CRLF), which gives:

/bin/bash^M: bad interpreter: No such file or directory

Clearing them with sed -i -e 's/\r$//' ./dalai did the trick.
I suggest you update the file in this PR.
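One hypothetical way to keep CRLF endings from reappearing for contributors on Windows would be a .gitattributes rule; the exact file patterns here are assumptions about the repository layout:

```
# .gitattributes: force LF endings for scripts, so the shebang line
# is not read as "/bin/bash^M" inside the Linux container
dalai  text eol=lf
*.sh   text eol=lf
```

Git then normalizes these files to LF on checkout regardless of the contributor's core.autocrlf setting.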

@leeola

leeola commented Mar 17, 2023

Interesting, this failed to build for me. Error:

Step 23/26 : RUN yarn install
 ---> Running in dda9e9bf1e29
yarn install v1.22.19
[1/4] Resolving packages...
[2/4] Fetching packages...
[3/4] Linking dependencies...
error Could not write file "/home/dalai/app/yarn-error.log": "EACCES: permission denied, open '/home/dalai/app/yarn-error.log'"
error An unexpected error occurred: "EACCES: permission denied, mkdir '/home/dalai/app/node_modules'".
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
The command '/bin/sh -c yarn install' returned a non-zero code: 1

I fixed it locally by adding a chown above the WORKDIR, e.g.:

RUN mkdir /home/dalai/app && chown -R dalai:dalai /home/dalai/app
WORKDIR /home/dalai/app
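Taken together with the root-owned models directory reported above, a sketch of how these permission fixes might fit into the Dockerfile could be as follows; the dalai user/group and paths are assumptions inferred from the error messages, not the PR's actual Dockerfile:

```dockerfile
# Create the writable directories and hand them to the unprivileged
# user *before* switching into them, so yarn can create node_modules
# and the model conversion step can write into /home/dalai/models.
RUN mkdir -p /home/dalai/app /home/dalai/models \
    && chown -R dalai:dalai /home/dalai/app /home/dalai/models
USER dalai
WORKDIR /home/dalai/app
```

Running the build steps as the dalai user from this point on avoids both the EACCES failure during yarn install and the root-owned volume contents.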

ARG NODE_PACKAGE=node-v$NODE_VERSION-linux-x64
ARG NODE_HOME=/opt/$NODE_PACKAGE

ENV LLAMA_MODEL=7B


Any reason you are hardcoding the 7B model here?

bernatvadell (Author)

It's not hardcoded; you can change it at runtime by passing an env variable:

docker run -e LLAMA_MODEL=7B ...

By default we use 7B (the smallest).


But then you need to add this to the docker-compose.yaml file too. At the moment, if you just run docker-compose up, it uses the 7B model.

bernatvadell (Author)

It's possible that the last merge, which changed the API, broke something. I'll review it and report back.

- ./models:/home/dalai/models
ports:
- 3000:3000
# command: tail -f /dev/null


Should we add command: yarn just:run $LLAMA_MODEL here again?
When I run docker compose up, the env variable in docker-compose.yaml is ignored and the hardcoded one from the Dockerfile is used.
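A sketch of the compose-side override being discussed; the just:run script name comes from the comment above, while the rest of the service definition is assumed:

```yaml
services:
  dalai:
    image: cocktailpeanut/dalai
    environment:
      - LLAMA_MODEL=13B   # takes precedence over the Dockerfile's ENV
    # $$ stops compose from interpolating on the host, so the variable
    # is expanded by the shell inside the container at startup:
    command: sh -c "yarn just:run $$LLAMA_MODEL"
    ports:
      - "3000:3000"
```

Re-specifying command this way matters when the model name was baked into the image's CMD at build time; the compose-level environment value alone would not change it.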

@evanjrowley

It would be really nice if the instructions for running this could also work on Fedora, which uses Podman. Compatibility of the container image should not be a problem, but the current directions rely on Docker Compose, which is not well supported by Podman.

7 participants