A self-trained GPT (Generative Pre-trained Transformer) with ~1M parameters, trained on Shakespeare's texts using a character-level tokenizer. This is a simple model trained and run locally on a MacBook Air M1 for educational purposes. With more compute from faster GPUs, a larger model could be trained that produces far more coherent text.
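A character-level tokenizer treats every distinct character in the corpus as its own token, which keeps the vocabulary tiny (typically ~65 symbols for the Shakespeare corpus) at the cost of longer sequences. A minimal sketch of the idea, assuming the corpus is plain text (the variable names and helpers here are illustrative, not this repository's actual code):

```python
# Minimal sketch of a character-level tokenizer.
# The sample string stands in for the full Shakespeare corpus.
text = "ROMEO: But soft, what light through yonder window breaks?"

# Vocabulary: one token id per distinct character, in sorted order.
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> integer id
itos = {i: ch for i, ch in enumerate(chars)}  # integer id -> char

def encode(s):
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of integer token ids back to a string."""
    return "".join(itos[i] for i in ids)

# Round trip: encoding then decoding recovers the original text.
print(decode(encode("ROMEO")))
```

The model is then trained to predict the next character id given the previous ones, and sampling 500 tokens means generating 500 characters one at a time.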
Output (500 tokens):
BUCKIO: Gho, my sen oather to fairtly to heaven.
RICHMIO: For his trancly you our behour well, God that no more won, I he'll have, and kine; On is make drath's blastick of rag, Eve by the thy king appeasd my dayst, Asidier at I had thy tendenty heave than shows as protiment? therefor sungrangiess it buchmons fork'd.
ROMEO: AHow safter, or it to wack; to not had mournanate: But why, if was Marcolies lords, and we that Cotizer.
MENENIUS: Whold, captilanus, Our seestling Duke you saw a polk-'tin