haikoo

Notes and dataset from the paper "Haiku Generation: A Transformer Based Approach With Lots Of Control"

In "Fine-tuning GPT-2 on Haikus.ipynb" you'll find everything your need to download a pretrained GPT-2 model and fine-tune it using Hugging Face. "haiku_utils.ipynb" contains some eval functions and utilities to quickly conjure a bag of words on a given domain.

Head over to PPLM and set up their code. Once you can run their demo successfully, simply run:

```
python run_pplm.py -B bag_of_words --cond_text "Starting text" --length 50 --gamma 1.5 --num_iterations 3 --num_samples 10 --stepsize 0.03 --window_length 5 --kl_scale 0.01 --gm_scale 0.99 --colorama --sample --pretrained_model "/home/username/haiku/gpt2-haiku/" --seed $(shuf -i 1-999 -n 1)
```

Replace bag_of_words with the wordlist file you wish to use (look in PPLM-master/paper_code/wordlists), and "Starting text" with the prompt you want to seed your poem with. And of course, make sure --pretrained_model points to your fine-tuned model's directory. Please refer to the paper for additional considerations, especially regarding stepsize.
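
If none of the shipped wordlists fits your domain, a bag-of-words file is just one word per line, so you can roll your own. A hypothetical sketch follows; the words and filename are illustrative, and haiku_utils.ipynb has this repo's own helpers for building such lists.

```python
# Hypothetical sketch: write a custom bag-of-words file in the same
# one-word-per-line format as the files in PPLM-master/paper_code/wordlists.
season_words = ["autumn", "leaves", "moon", "frost", "harvest", "dusk"]
with open("autumn.txt", "w") as f:
    f.write("\n".join(season_words) + "\n")
```

You can then point the command above at it, e.g. -B autumn.txt, since PPLM accepts a filepath as well as a named wordlist.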

Hopefully you'll get some inspiration.
