# ml-experiment-utils

Opinionated utilities for quick ML and deep learning experiments.
A personal collection of reusable utilities designed to streamline machine learning and deep learning experimentation workflows. This package provides ready-to-use components for common ML tasks, with built-in best practices and sensible defaults.
## Features

### `simple_cls_train_v1`

Complete training loop for classification tasks with:

- Automatic device detection (CUDA/CPU)
- Cosine annealing learning rate scheduling
- Built-in Weights & Biases integration for experiment tracking
- Periodic evaluation and logging
- Progress tracking with tqdm
- Exponentially weighted moving averages for metrics
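Cosine annealing decays the learning rate along a half cosine wave, from its peak down to (near) zero over the course of training; PyTorch's `CosineAnnealingLR` implements the same curve. A minimal pure-Python sketch of the schedule (the default `lr_max` here is just an example value, not what `simple_cls_train_v1` uses):

```python
import math

def cosine_annealed_lr(step: int, total_steps: int,
                       lr_max: float = 5e-4, lr_min: float = 0.0) -> float:
    """Cosine annealing: smoothly decay lr_max -> lr_min over total_steps."""
    progress = step / total_steps
    return lr_min + 0.5 * (lr_max - lr_min) * (1.0 + math.cos(math.pi * progress))

# Starts at lr_max, passes through the midpoint halfway, ends at lr_min.
```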
### `CIFAR_for_torch`

Pre-configured CIFAR-10 dataset with:

- Standard normalization (channel-wise mean/std)
- Data augmentation for training (ColorJitter, horizontal flip, rotation)
- Ready-to-use PyTorch DataLoaders
### `ApproximateSlidingAverage`

Efficient exponentially weighted moving average for metric tracking.
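An exponentially weighted moving average approximates a sliding window over roughly `window` samples in O(1) memory. A sketch of the idea (the class name, API, and bias-correction detail here are illustrative, not necessarily how `ApproximateSlidingAverage` is implemented):

```python
class EWMA:
    """Illustrative exponentially weighted moving average.

    Approximates a sliding average over ~window samples using O(1) memory:
    value <- beta * value + (1 - beta) * x, with beta = 1 - 1/window.
    """

    def __init__(self, window: int = 100):
        self.beta = 1.0 - 1.0 / window
        self.value = 0.0
        self.steps = 0

    def update(self, x: float) -> float:
        self.steps += 1
        self.value = self.beta * self.value + (1.0 - self.beta) * x
        # Bias correction compensates for the zero initialization,
        # so early estimates are not dragged toward zero.
        return self.value / (1.0 - self.beta ** self.steps)

avg = EWMA(window=10)
for loss in [1.0, 0.8, 0.6]:
    smoothed = avg.update(loss)
```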
## Installation

```bash
pip install ml-experiment-utils
```

Or with uv:

```bash
uv add ml-experiment-utils
```

## Quick Start

```python
import torch.nn as nn

from ml_experiment_utils.experiments.data import CIFAR_for_torch
from ml_experiment_utils.experiments.train_loops import simple_cls_train_v1

# Load data
train_loader, test_loader = CIFAR_for_torch(batch_size=128)

# Define your model
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
)

# Train with built-in logging and evaluation
simple_cls_train_v1(
    model=model,
    epochs=10,
    eval_steps=100,
    train_loader=train_loader,
    test_loader=test_loader,
    lr=5e-4,
    name="my-experiment",
)
```

The data loaders can also be used on their own:

```python
from ml_experiment_utils.experiments.data import CIFAR_for_torch

# Get pre-configured CIFAR-10 loaders
train_loader, test_loader = CIFAR_for_torch(
    batch_size=64,
    root="./my_data",
)

# Start training immediately
for images, labels in train_loader:
    # Your training code here
    pass
```

## Requirements

- PyTorch (with torchvision)
- Weights & Biases
- tqdm
## Design Philosophy

This package is designed with the following principles:
- Opinionated but flexible: Sensible defaults that work well for most cases, but configurable when needed
- Batteries included: Everything you need to start experimenting quickly
- Experiment tracking first: Built-in W&B integration for reproducibility
- Minimal boilerplate: Focus on model architecture, not training infrastructure
## Use Cases

This package is ideal for:
- Quick prototyping of ML models
- Educational projects and learning
- Baseline implementations for research
- Rapid iteration on model architectures
## Contributing

This is primarily a personal utility package, but suggestions and improvements are welcome! Feel free to open an issue or submit a pull request.
## License

MIT License - see LICENSE file for details.