# LLM Lab

A repo for building, fine-tuning, and deploying custom LLMs for domain-specific tasks.

## 🧪 Projects

| Project | Description | Status |
| --- | --- | --- |
| fuelrx-llm | Meal plan generation for FuelRx | 🚧 In Progress |

πŸ—οΈ Repository Structure

llm-lab/
β”œβ”€β”€ README.md                # This file
β”œβ”€β”€ fuelrx-llm/              # Meal planning LLM project
β”‚   β”œβ”€β”€ notebooks/           # Exploration & learning
β”‚   β”œβ”€β”€ data/                # Training data pipeline
β”‚   β”œβ”€β”€ training/            # Fine-tuning scripts
β”‚   β”œβ”€β”€ inference/           # Production inference
β”‚   └── evaluation/          # Benchmarking suite
β”œβ”€β”€ [future-project]/        # Next domain-specific LLM
└── shared/                  # (future) Common utilities

## 🎯 Philosophy

Each project in this lab follows a similar pattern:

1. **Explore** – understand base model capabilities via notebooks
2. **Extract** – build training data from existing systems
3. **Evaluate** – create domain-specific benchmarks
4. **Fine-tune** – train with LoRA for efficiency
5. **Deploy** – ship to HuggingFace Endpoints
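Step 4 refers to LoRA (Low-Rank Adaptation), which freezes the base weights `W` and learns only a low-rank update `B @ A`. A toy NumPy illustration of the idea (this is a conceptual sketch, not the training code in this repo):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                   # model dimension and LoRA rank (r << d)
W = rng.normal(size=(d, d))   # frozen base weight
A = rng.normal(size=(r, d))   # trainable down-projection
B = np.zeros((d, r))          # trainable up-projection, initialized to zero

# During fine-tuning only A and B receive gradients; the effective weight is:
W_eff = W + B @ A

# With B initialized to zero, the adapted model starts out identical to the base
print(np.allclose(W_eff, W))  # True
```

Because only `A` and `B` are trained, the number of trainable parameters drops from `d*d` to `2*d*r`, which is why LoRA makes fine-tuning cheap.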

## 🚀 Getting Started

Each project is self-contained with its own dependencies and setup instructions. Navigate to a project directory and follow its README:

```shell
cd fuelrx-llm
cat README.md
```

1. Copy the example env file and fill in your values:

   ```shell
   cp fuelrx-llm/.env.example fuelrx-llm/.env
   ```

2. Build the Docker image:

   ```shell
   cd fuelrx-llm
   docker build -t fuelrx-llm-dev -f docker/Dockerfile.dev .
   ```

3. Start the container with a bash shell:

   ```shell
   docker run -it \
     -v $(pwd):/workspace \
     --env-file .env \
     fuelrx-llm-dev /bin/bash
   ```

Alternatively, if the container is already running, exec into it:

```shell
# Find the container ID
docker ps

# Exec into it
docker exec -it <container_id> /bin/bash
```

Or start the container without attaching a shell, to work with Jupyter notebooks instead:

```shell
docker run -it \
  -v $(pwd):/workspace \
  -p 8888:8888 \
  fuelrx-llm-dev
```

### Extracting LLM logs from FuelRx

Inside the running container:

```shell
cd data
python extract_from_llm_logs.py --output training_data/meals.jsonl
```
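The extractor writes JSON Lines: one standalone JSON object per line. A minimal loader sketch for downstream steps (the field names in the sample record are illustrative; the real schema is whatever `extract_from_llm_logs.py` emits):

```python
import json

def load_jsonl(path):
    """Load a JSON Lines file: one standalone JSON object per line."""
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]

# Write a one-record sample file purely for illustration;
# in practice you would point load_jsonl at training_data/meals.jsonl.
with open("meals_sample.jsonl", "w") as f:
    f.write('{"prompt": "Plan a high-protein day", "completion": "Breakfast: ..."}\n')

records = load_jsonl("meals_sample.jsonl")
print(len(records))  # 1
```

Loading the whole file this way also doubles as a validation pass: `json.loads` raises immediately on any malformed line.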

To exit and stop the container:

```shell
exit

# List running containers
docker ps

# Stop a container by name or ID
docker stop <container_name_or_id>
```

## 📦 Shared Components (Planned)

As patterns emerge across projects, common code will be extracted to `shared/`:

- Evaluation harnesses
- Training utilities
- Deployment scripts
- Schema validation helpers

πŸ“ Adding a New Project

  1. Create a new directory: mkdir my-new-llm
  2. Copy the structure from an existing project as a starting point
  3. Update this README's project table
  4. Customize for your domain
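Steps 1–2 can be scripted. A small sketch that scaffolds the same layout as `fuelrx-llm` (directory names taken from the repository tree above; `my-new-llm` is a placeholder name):

```python
from pathlib import Path

# Placeholder project name -- substitute your own
project = Path("my-new-llm")

# Mirror the fuelrx-llm subdirectory layout
for sub in ("notebooks", "data", "training", "inference", "evaluation"):
    (project / sub).mkdir(parents=True, exist_ok=True)

# Stub README for the new project's own setup instructions
(project / "README.md").touch()

print(sorted(p.name for p in project.iterdir()))
```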

## 🔗 Resources
