
Small-Language-Model

Hands-on exploration of how small language models work, from n-grams to transformers built with Keras, as part of the Google DeepMind course “Build Your Own Small Language Model.” Covers tokenization, model training, evaluation, and the ethics of AI innovation.
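As a taste of the simplest model family covered, below is a minimal sketch of a character-level bigram model (an n-gram with n = 2). The corpus, variable names, and generation loop are illustrative assumptions for this README and are not taken from the course materials or the notebooks in this repository.

```python
# Minimal character-level bigram sketch (illustrative; not the course code).
import random
from collections import defaultdict

corpus = "hello world hello there hello world"  # toy corpus, assumption

# Count how often each character follows each other character.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def sample_next(prev_char):
    """Sample the next character in proportion to its bigram counts."""
    chars, weights = zip(*counts[prev_char].items())
    return random.choices(chars, weights=weights, k=1)[0]

# Generate a short continuation from a seed character.
text = "h"
for _ in range(20):
    text += sample_next(text[-1])
print(text)
```

The same next-token-prediction framing carries over to the Keras transformer models later in the course: only the way the conditional distribution is estimated changes.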
