MeGPT allows you to fine-tune a large language model on your own messages, enabling you to talk to yourself.
This repo contains code for:
- Extracting your iMessage conversations from your Mac
- Fine-tuning a large language model on your messages
- Generating completions using the fine-tuned model
This is a sample repo that trains Meta AI's OPT-1.3b model with Parameter-Efficient Fine-Tuning (PEFT) on your iMessage conversations. You can use this repo as a starting point for fine-tuning other models on your own data.
Based on example code from lvwerra/trl.
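For context, PEFT methods such as LoRA freeze the base model and train only small adapter matrices, which keeps fine-tuning a 1.3B-parameter model feasible on modest hardware. Below is a minimal sketch of what wrapping OPT-1.3b with LoRA adapters looks like; the hyperparameter values are illustrative, not the repo's exact settings.

```python
# Minimal LoRA setup sketch (illustrative values, not the repo's config).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")

lora_config = LoraConfig(
    r=16,              # rank of the low-rank update matrices
    lora_alpha=32,     # scaling factor applied to the LoRA updates
    lora_dropout=0.05,
    task_type=TaskType.CAUSAL_LM,
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the small adapter weights are trainable
```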
- Install the requirements:
pip install -r requirements.txt
- On your Mac, run extract_messages.py to extract your iMessage conversations and save them to a CSV file:
python extract_messages.py
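If you are curious what the extraction step involves, here is a hedged sketch of the kind of query such a script might run against the Messages database. The actual extract_messages.py may read different columns; the chat.db schema also varies across macOS versions (newer releases sometimes store text in an attributedBody blob rather than the text column), and your terminal needs Full Disk Access to read the file.

```python
# Sketch of reading iMessage history from chat.db into a CSV.
# Column names are assumptions based on the common macOS Messages schema.
import csv
import sqlite3
from pathlib import Path

db_path = Path.home() / "Library" / "Messages" / "chat.db"

query = """
SELECT message.text, message.is_from_me, handle.id AS contact
FROM message
LEFT JOIN handle ON message.handle_id = handle.ROWID
WHERE message.text IS NOT NULL AND message.text != ''
ORDER BY message.date
"""

with sqlite3.connect(db_path) as conn, open("messages.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["text", "is_from_me", "contact"])
    writer.writerows(conn.execute(query))
```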
- Configure fine_tune.py with your desired model, input CSV, and other settings. For example:
model_name = "facebook/opt-1.3b"
block_size = 128
input_csv = "messages.csv"
To see the full list of supported models, visit the PEFT Models Support Matrix.
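To give a sense of how these settings might be consumed, here is a sketch of preparing the CSV for causal language modeling: tokenize every message, concatenate the token ids, and split them into block_size-length chunks. The "text" column name and the grouping logic are assumptions, not the repo's verbatim code.

```python
# Sketch: turn messages.csv into fixed-length token blocks for training.
from transformers import AutoTokenizer
from datasets import load_dataset

model_name = "facebook/opt-1.3b"
block_size = 128
input_csv = "messages.csv"

tokenizer = AutoTokenizer.from_pretrained(model_name)
dataset = load_dataset("csv", data_files=input_csv)["train"]
dataset = dataset.filter(lambda row: bool(row["text"]))  # drop empty rows

def tokenize(batch):
    return tokenizer(batch["text"])

def group_into_blocks(batch):
    # Concatenate all token ids, then cut them into block_size-sized chunks.
    ids = sum(batch["input_ids"], [])
    total = (len(ids) // block_size) * block_size
    chunks = [ids[i : i + block_size] for i in range(0, total, block_size)]
    return {"input_ids": chunks, "labels": [c[:] for c in chunks]}

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)
lm_dataset = tokenized.map(group_into_blocks, batched=True, remove_columns=tokenized.column_names)
```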
- Train the model on your messages using fine_tune.py:
python fine_tune.py
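Under the hood, training a PEFT model is a standard Hugging Face Trainer loop. The sketch below assumes the LoRA-wrapped `model`, the `tokenizer`, and the `lm_dataset` from the sketches above; the TrainingArguments values are placeholders, not tuned settings, and the output directory name is hypothetical.

```python
# Sketch of the training step with a PEFT-wrapped model.
from transformers import Trainer, TrainingArguments, DataCollatorForLanguageModeling

training_args = TrainingArguments(
    output_dir="megpt-opt-1.3b-lora",   # hypothetical adapter directory
    per_device_train_batch_size=4,
    num_train_epochs=3,
    learning_rate=2e-4,
    logging_steps=50,
)

trainer = Trainer(
    model=model,                        # PEFT-wrapped OPT-1.3b
    args=training_args,
    train_dataset=lm_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("megpt-opt-1.3b-lora")  # saves only the LoRA adapter weights
```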
- Generate completions using the fine-tuned model with generate.py:
python generate.py
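For reference, generation with a LoRA-tuned model typically means loading the frozen base model, attaching the saved adapter, and sampling. This is a sketch rather than the contents of generate.py; the adapter directory name and prompt are assumptions carried over from the training sketch above.

```python
# Sketch: generate a completion with the base model plus the saved LoRA adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")
model = PeftModel.from_pretrained(base, "megpt-opt-1.3b-lora")  # hypothetical adapter dir
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-1.3b")

prompt = "hey, are you free for dinner tonight?"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=60, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```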