# Dagster ❤️ Llama 🦙

A toy example that uses Dagster orchestration together with llama.cpp to build a performant and observable LLM data pipeline.

This repo is the supporting material for the blog post *Dagster ❤️ Llama - orchestration for modern LLM pipelines*.
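
The pattern demonstrated here is a Dagster asset graph whose assets call a locally running llama.cpp server over its OpenAI-compatible HTTP API. The sketch below is a minimal, hypothetical illustration of that pattern rather than code from this repo: the asset names, the prompt, and the endpoint URL are assumptions (`llama_cpp.server` listens on port 8000 by default).

```python
# Minimal sketch (not the repo's actual code): two Dagster assets where the
# downstream asset sends text to the local llama.cpp server for completion.
import requests
from dagster import Definitions, asset

# Assumed endpoint: llama_cpp.server's OpenAI-compatible completions API,
# on its default port 8000.
LLAMA_SERVER_URL = "http://localhost:8000/v1/completions"


@asset
def raw_document() -> str:
    """Stand-in upstream asset producing the text to summarise."""
    return (
        "Dagster orchestrates data pipelines; llama.cpp serves quantised "
        "LLMs locally behind an OpenAI-compatible HTTP API."
    )


@asset
def document_summary(raw_document: str) -> str:
    """Send the upstream text to the llama.cpp server and return the completion."""
    response = requests.post(
        LLAMA_SERVER_URL,
        json={
            "prompt": f"Summarise in one sentence: {raw_document}",
            "max_tokens": 64,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["text"]


defs = Definitions(assets=[raw_document, document_summary])
```

Dagit loads the definitions it finds in the file passed via `-f` (here `src/__init__.py`), which is how the run commands below wire everything together.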

## Installation

```bash
# Pin the local Python version (use an installed 3.9 release)
pyenv local 3.9.x

# Keep the virtualenv inside the project and install dependencies
poetry config virtualenvs.in-project true --local
poetry install
```

## To run

```bash
# Start up Llama server
poetry run python -m llama_cpp.server --model ggml-model-q4_0.bin

# Start dagit
poetry run dagit -f src/__init__.py -d .
```
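
With both processes up, the llama.cpp server should be listening on its default port 8000 and the Dagit UI should be available at http://localhost:3000 (Dagit's default port); from there the pipeline can be launched and monitored.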
