velebit-ai/stable-diffusion-tritonserver
Stable Diffusion on Tritonserver

Download models

# clone this repo
git clone https://github.com/velebit-ai/stable-diffusion-tritonserver.git
cd stable-diffusion-tritonserver
# clone model repo from huggingface
git lfs install
git clone https://huggingface.co/kamalkraj/stable-diffusion-v1-4-onnx

Extract the model weights

cd stable-diffusion-v1-4-onnx
tar -xvzf models.tar.gz

Triton Inference Server

Build

docker build -t tritonserver .

Run

docker run -it --rm --gpus all -p8000:8000 -p8001:8001 -p8002:8002 --shm-size 16384m   \
-v $PWD/stable-diffusion-v1-4-onnx/models:/models tritonserver \
tritonserver --model-repository /models/
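The run command above publishes Triton's HTTP port (8000), gRPC port (8001), and metrics port (8002). Before sending requests it can help to confirm the server actually came up; Triton exposes a KServe-v2 readiness endpoint at `/v2/health/ready` on the HTTP port. A minimal stdlib probe, assuming the default host and port from the command above:

```python
import http.client


def server_ready(host: str = "localhost", port: int = 8000, timeout: float = 2.0) -> bool:
    """Return True if Triton answers its HTTP readiness probe with 200."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", "/v2/health/ready")
        return conn.getresponse().status == 200
    except (ConnectionError, OSError):
        # Server not up (yet), or host/port unreachable.
        return False
    finally:
        conn.close()


if __name__ == "__main__":
    print("ready" if server_ready() else "not ready")
```

Models also report individual readiness at `/v2/models/<name>/ready` once they finish loading.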

Inference

Install the tritonclient Python package, then run the notebook in this repository to generate images.

pip install "tritonclient[http]"
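Under the hood, the HTTP client posts a KServe-v2 JSON body to `/v2/models/<name>/infer`. The following is a minimal sketch of such a request body built with the standard library only; the input name `PROMPT` and the single-string BYTES input are illustrative assumptions here, not taken from this repo — the real model name and input names are defined in each model's `config.pbtxt`:

```python
import json


def build_infer_request(prompt: str, input_name: str = "PROMPT") -> dict:
    """Build a KServe-v2 inference request body for a single text prompt.

    The input name and shape are assumptions for illustration; check the
    deployed model's config.pbtxt for the actual tensor names and dtypes.
    """
    return {
        "inputs": [
            {
                "name": input_name,
                "shape": [1],
                "datatype": "BYTES",
                "data": [prompt],
            }
        ]
    }


# Serialized body, ready to POST to http://localhost:8000/v2/models/<name>/infer
body = json.dumps(build_infer_request("a photo of an astronaut riding a horse"))
```

In practice `tritonclient.http.InferenceServerClient` assembles this payload for you from `InferInput` objects, so the notebook does not construct JSON by hand.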

Credits
