Fine-tune Gemma-4 locally on distilled persona data — self-contained persona models for phones and personal computers
Updated Apr 14, 2026 · Python
Universal persona distillation skill for OpenPersona — distill any person or character into a runnable skill pack
Persistent, searchable persona knowledge base — MemPalace + Knowledge Graph + Karpathy LLM Wiki. Data layer for OpenPersona persona training pipeline.