Hands-on exploration of how small language models work, from n-grams to transformers built with Keras, as part of the Google DeepMind course “Build Your Own Small Language Model.” Covers tokenization, model training, evaluation, and the ethics of AI innovation.
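To give a flavor of the n-gram starting point mentioned above, here is a minimal word-bigram sketch in plain Python. This is illustrative only and not code from this repository; the function names and toy corpus are invented for the example.

```python
from collections import defaultdict, Counter

def train_bigram(text):
    """Count, for each word, how often each word follows it."""
    words = text.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Toy corpus: "cat" follows "the" twice, so it wins the prediction.
model = train_bigram("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # -> cat
```

Transformer-based models covered later in the course replace these raw counts with learned token embeddings and attention, but the prediction task, choosing the next token given context, stays the same.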
aee4/Small-Language-Model