
# peytontolbert/tinylm

This repository features a low-parameter MoE model built from scratch.

A simple implementation of a mixture-of-experts (MoE) transformer in PyTorch. The model was trained on a personal dataset and produces passable responses.

- Training script: `train.py`
- Fine-tuning script: `finetune.py`
- MoE implementation: `utils.py` (see the sketch below)
- Try out the model: `httpchatbot.py` or `eval.py`
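
For readers new to the technique, here is a minimal sketch of a top-k gated MoE feed-forward layer in PyTorch. It illustrates the general idea only; the class name, parameters, and routing details are assumptions for illustration, not the actual API in `utils.py`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Illustrative top-k gated MoE feed-forward layer (not the repo's utils.py API)."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One small two-layer MLP per expert.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        # Router: scores each token against every expert.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten to (tokens, d_model)
        batch, seq_len, d_model = x.shape
        tokens = x.reshape(-1, d_model)

        # Select the top-k experts per token and renormalize their gate weights.
        scores = F.softmax(self.gate(tokens), dim=-1)        # (tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)   # (tokens, top_k)
        weights = weights / weights.sum(dim=-1, keepdim=True)

        out = torch.zeros_like(tokens)
        # Route each token only through the experts it selected.
        for expert_id, expert in enumerate(self.experts):
            mask = indices == expert_id                      # (tokens, top_k)
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])

        return out.reshape(batch, seq_len, d_model)
```

Dropping a layer like this in place of the dense feed-forward block in a standard transformer keeps per-token compute roughly constant while growing total parameter count with the number of experts, which is what lets a low-parameter-per-token model punch above its active size.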
