Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
BitNet: Learning-Based-Bit-Depth-Expansion
Running Microsoft's BitNet inference framework via FastAPI, Uvicorn and Docker.
Distily: Language Model Distillation Toolkit and Library
Long-term project about a custom AI architecture. It combines cutting-edge machine learning techniques such as Flash-Attention, Grouped-Query-Attention, ZeRO-Infinity, BitNet, etc.
This is the repo for the MixKABRN Neural Network (Mixture of Kolmogorov-Arnold Bit Retentive Networks), an attempt to first adapt it for training on text, and later adjust it for other modalities.
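Several of the repositories above build on BitNet-style 1.58-bit weights. As a rough, simplified sketch (not any repo's actual implementation), the absmean ternary quantization described in the BitNet b1.58 paper maps each weight to {-1, 0, +1} scaled by the mean absolute weight:

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray):
    """Quantize a weight matrix to ternary values {-1, 0, +1}.

    Sketch of BitNet b1.58-style absmean quantization: scale by the
    mean absolute value, round, then clip to the ternary range.
    """
    scale = np.abs(w).mean() + 1e-8      # absmean scale (epsilon avoids div-by-zero)
    q = np.clip(np.round(w / scale), -1, 1)
    return q, scale

# Example: weights collapse to ternary values; w is approximated by q * scale.
w = np.array([[0.9, -0.05, -1.2],
              [0.3,  0.0,  -0.4]])
q, scale = absmean_ternary_quantize(w)
```

Storing only `q` (2 bits per weight suffice for three values) plus one scale per tensor is what makes 1.58-bit inference frameworks like the ones listed here memory-efficient.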