AyushGupta235/SP-GPT

SP-GPT

Self-trained GPT (Generative Pre-trained Transformer) with ~1M parameters, trained on Shakespeare texts using a character-level tokenizer. This is a simple model, trained and run locally on a MacBook Air M1 for educational purposes. With more compute from faster GPUs, a larger model could be trained that produces much more coherent text.
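A character-level tokenizer treats every distinct character in the corpus as one token, so the vocabulary stays tiny (Shakespeare's text needs only about 65 symbols). The repo's exact implementation is not shown on this page; the sketch below follows the common approach for this kind of model, and all names in it are illustrative.

```python
# Minimal sketch of a character-level tokenizer (illustrative, not the
# repo's actual code). The vocabulary is the set of unique characters.
text = "First Citizen: Before we proceed any further, hear me speak."

chars = sorted(set(text))                     # unique characters, sorted
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> integer token id
itos = {i: ch for ch, i in stoi.items()}      # integer token id -> char

def encode(s: str) -> list[int]:
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids: list[int]) -> str:
    """Map a list of token ids back to a string."""
    return "".join(itos[i] for i in ids)

sample = "hear me"
ids = encode(sample)
assert decode(ids) == sample  # the round trip is lossless
print(f"vocab size: {len(chars)}")
```

Because the vocabulary is so small, the embedding table is tiny, which is part of why the whole model fits in roughly 1M parameters; the trade-off is that sequences are much longer than with a subword tokenizer.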

Sample output (500 tokens):

```
BUCKIO: Gho, my sen oather to fairtly to heaven.

RICHMIO: For his trancly you our behour well, God that no more won, I he'll have, and kine; On is make drath's blastick of rag, Eve by the thy king appeasd my dayst, Asidier at I had thy tendenty heave than shows as protiment? therefor sungrangiess it buchmons fork'd.

ROMEO: AHow safter, or it to wack; to not had mournanate: But why, if was Marcolies lords, and we that Cotizer.

MENENIUS: Whold, captilanus, Our seestling Duke you saw a polk-'tin
```
