Free-Range, Organic, Hand-Crafted.
```
step 67000, loss: 1.2239, it/s: 0.7:
To be, and be of men?
Prown AMEN:
O yout aboars of
Ra':
Un

step 77000, loss: 1.0891, it/s: 0.8:
To be, fo hend!
First her sense ountier to Jupits,
be horse.
```
Wiser words have never been spoken. Trained on an M2 Air 16GB for... a while, idk. Be horse.
Install dependencies via uv:

```sh
uv sync
```

Add a training corpus (a single `.txt` file) to `/data` and call it `input.txt`. For example, the tiny Shakespeare dataset:
```sh
curl -o data/input.txt https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt
```

Models are saved to `checkpoints/checkpoint.pt` by default. Old models are overwritten during training.
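Since a new run clobbers the previous checkpoint, you can copy it aside first. This is just a plain `cp`, not a feature of the project; the dated filename is only an example:

```sh
# Back up the current checkpoint (if any) before retraining,
# since training overwrites checkpoints/checkpoint.pt in place.
if [ -f checkpoints/checkpoint.pt ]; then
  cp checkpoints/checkpoint.pt "checkpoints/checkpoint-$(date +%Y%m%d).pt"
fi
```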
Train (pick a device):

```sh
uv run train --device cuda  # or mps / cpu
```

Sample from the trained model:

```sh
uv run sample --query "To be, "
```

Export to ONNX:

```sh
uv run export-onnx --checkpoint checkpoints/checkpoint
```
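For intuition on what `sample --query` does with the query string: a character-level model like this one maps each unique character in the corpus to an integer id, feeds the encoded query in as a prompt, and decodes the generated ids back to text. The repo's actual tokenizer isn't shown in this README, so the sketch below is a generic char-level encode/decode, not the project's code:

```python
# Generic char-level tokenization sketch (assumption: the project's real
# vocabulary is built from data/input.txt; here a short string stands in).
text = "To be, or not to be"           # stand-in for the training corpus
vocab = sorted(set(text))              # one token per unique character
stoi = {ch: i for i, ch in enumerate(vocab)}
itos = {i: ch for i, ch in enumerate(vocab)}

def encode(s: str) -> list[int]:
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids: list[int]) -> str:
    """Map a list of token ids back to a string."""
    return "".join(itos[i] for i in ids)

ids = encode("To be, ")
print(ids)
print(decode(ids))  # round-trips back to "To be, "
```

Encoding only works for characters that appear in the corpus, which is why the query prompt should stick to characters present in `input.txt`.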