Code and samples from the paper "Language Models are Unsupervised Multitask Learners".

For now, we have only released a smaller (117M parameter) version of GPT-2.

See more details in our blog post.


Download the model data:

```sh
sh 117M
```

Install python packages:

```sh
pip3 install -r requirements.txt
```

Unconditional sample generation

WARNING: Samples are unfiltered and may contain offensive content.

To generate unconditional samples from the small model:

```sh
python3 src/ | tee samples
```

There are various flags for controlling the samples:

```sh
python3 src/ --top_k 40 --temperature 0.7 | tee samples
```
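The `--temperature` flag rescales the logits before sampling (values below 1 sharpen the distribution), and `--top_k` truncates sampling to the k highest-scoring tokens. A minimal sketch of that sampling scheme, for illustration only — `sample_logits` is a hypothetical helper, not the repository's actual implementation:

```python
import math
import random

def sample_logits(logits, temperature=1.0, top_k=0, rng=random):
    """Sample a token index from raw logits.

    temperature < 1 sharpens the distribution; top_k > 0 keeps only
    the k highest-scoring tokens before sampling. Illustrative sketch,
    not the code in this repository.
    """
    scaled = [l / temperature for l in logits]
    if top_k > 0:
        # Mask out everything below the k-th largest logit.
        cutoff = sorted(scaled, reverse=True)[top_k - 1]
        scaled = [l if l >= cutoff else float("-inf") for l in scaled]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# With top_k=1, sampling reduces to argmax: always picks index 1 here.
print(sample_logits([0.1, 2.0, -1.0], temperature=0.7, top_k=1))
```

With `top_k=0` (no truncation) and `temperature=1`, this is plain sampling from the softmax distribution, matching the default settings described below.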

Conditional sample generation

To give the model custom prompts, you can use:

```sh
python3 src/ --top_k 40
```

GPT-2 samples

While we have not yet released GPT-2 itself, you can see some samples from it in the gpt-2-samples folder. We show unconditional samples with default settings (temperature 1 and no truncation), with temperature 0.7, and with truncation with top_k 40.

Future work

We may release code for evaluating the models on various benchmarks.

We are still considering release of the larger models.

