Help regarding loading custom trained models #81
For example:

```shell
# make sure that you have git lfs installed
git lfs install
git clone https://huggingface.co/BAAI/bge-base-en-v1.5

# mount the model directory to /model inside the container and pass it to the `--model-id` CLI arg
docker run --gpus all -p 8080:80 -v $PWD/bge-base-en-v1.5:/model --pull always ghcr.io/huggingface/text-embeddings-inference:0.4.0 --model-id /model
```
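Once the container is running, you can check that the locally mounted model serves requests through TEI's `/embed` route. A minimal Python sketch of such a request, assuming the host/port mapping from the `docker run` command above (`build_embed_request` is a hypothetical helper, not part of TEI):

```python
import json
import urllib.request

# Assumes the container above is running with -p 8080:80 on localhost
TEI_EMBED_URL = "http://127.0.0.1:8080/embed"

def build_embed_request(texts, url=TEI_EMBED_URL):
    """Build a POST request for TEI's /embed endpoint.

    The endpoint accepts a JSON body of the form {"inputs": <str or list of str>}
    and returns one embedding vector per input text.
    """
    payload = json.dumps({"inputs": texts}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_embed_request(["What is Deep Learning?"])
# To actually send it (requires the container to be up):
# with urllib.request.urlopen(req) as resp:
#     embeddings = json.load(resp)
```

The request body shape follows the TEI API; the sketch only builds the request so it can be inspected without a running server.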
Hi, I need documentation/examples for loading a model from a local directory path. When a path is passed through `--model-id`, the server looks for a URL instead. Is there an additional flag I need to pass as an argument?