Run a GPT model in the browser with WebGPU. An implementation of GPT inference in ~1500 lines of vanilla JavaScript.
This repository contains a collection of exploratory notebooks written in pure Python and in plain, human-readable language. It compiles all lectures from Andrej Karpathy's 💎 playlist on Neural Networks, culminating in building GPT.
Symbolic Music NLP Artificial Intelligence Toolkit
Code repository for the paper "Traveling Words: A Geometric Interpretation of Transformers"
The simple repository for training/finetuning medium-sized GPTs.
Fast multi-instrumental music transformer with 4k sequence length, pentagram full-range MIDI note encoding, note counters, and outro tokens
We trained nanoGPT from scratch to generate emotional support responses for the user
Training karpathy/nanoGPT on a text corpus of A. Pushkin's poems
Arabic Nano GPT Trained on Arabic Wikipedia Dataset from Wikimedia
The simplest, fastest repository for training/finetuning medium-sized GPTs, with Google Colab finetuning support
Lo, behold this wondrous contrivance—a digital bard, imbued with the spirit of Shakespeare! With quill of code and soul of verse, it doth craft sonnets, soliloquies, and merry jests, as if the Bard himself whispered each line in thy ear.
The simplest, fastest repository for training/finetuning medium-sized GPTs - Verbose Version for ELI5