🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
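A minimal usage sketch of the library's high-level pipeline API (the task and example text are illustrative; the default checkpoint is whatever the library selects for that task):

```python
# Minimal sketch of the 🤗 Transformers pipeline API.
from transformers import pipeline

# Builds a sentiment-analysis pipeline with the library's default checkpoint.
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts such as [{"label": "POSITIVE", "score": 0.99}].
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
```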
OpenMMLab Detection Toolbox and Benchmark
A high-throughput and memory-efficient inference and serving engine for LLMs
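A rough sketch of vLLM's offline batched-generation interface (the model id, prompt, and sampling values below are placeholders):

```python
# Sketch of offline batched generation with vLLM.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # placeholder checkpoint
params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

outputs = llm.generate(["The capital of France is"], params)
for out in outputs:
    print(out.outputs[0].text)
```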
OpenMMLab Semantic Segmentation Toolbox and Benchmark.
Easy-to-use Speech Toolkit including Self-Supervised Learning models, SOTA/Streaming ASR with punctuation, Streaming TTS with a text frontend, a Speaker Verification System, End-to-End Speech Translation, and Keyword Spotting. Won the NAACL 2022 Best Demo Award.
Chinese GPT-2 training code, using the BERT tokenizer.
Easy-to-use image segmentation library with an awesome pre-trained model zoo, supporting a wide range of practical tasks in Semantic Segmentation, Interactive Segmentation, Panoptic Segmentation, Image Matting, 3D Segmentation, etc.
A framework for few-shot evaluation of language models.
Code for the paper "Jukebox: A Generative Model for Music"
A TensorFlow Implementation of the Transformer: Attention Is All You Need
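For reference, a small NumPy sketch of the scaled dot-product attention that the paper (and this repository) implements; the shapes and softmax formulation are the standard ones, not code from the repo:

```python
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(q, k, v):
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)       # (..., seq_q, seq_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over keys
    return weights @ v                                   # (..., seq_q, d_v)

q = k = v = np.random.randn(2, 4, 8)                     # (batch, seq, d_model)
print(scaled_dot_product_attention(q, k, v).shape)       # (2, 4, 8)
```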
PyTorch implementation of Google AI's 2018 BERT
The GitHub repository for the paper "Informer", accepted at AAAI 2021.
Production-First and Production-Ready End-to-End Speech Recognition Toolkit
pix2tex: Using a ViT to convert images of equations into LaTeX code.
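A usage sketch assuming the pix2tex Python package is installed (the image path is a placeholder):

```python
# Sketch: OCR an image of an equation into LaTeX with pix2tex.
from PIL import Image
from pix2tex.cli import LatexOCR

model = LatexOCR()                  # loads the pretrained ViT-based model
img = Image.open("equation.png")    # placeholder path to an equation screenshot
print(model(img))                   # prints the predicted LaTeX string
```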
Large Language Model Text Generation Inference
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Trax — Deep Learning with Clear Code and Speed
RWKV is an RNN with transformer-level LLM performance that can be trained directly like a GPT (parallelizable), combining the best of the RNN and the transformer: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
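A toy sketch of the recurrent view behind that claim: a heavily simplified per-channel WKV update (ignoring token shift, channel mixing, and the numerical-stability tricks used in the actual code), showing why per-token inference only needs a small running state rather than the whole context:

```python
# Toy, simplified per-channel RWKV-style WKV recurrence (not the repo's implementation).
import numpy as np

def wkv_recurrence(k, v, w, u):
    """k, v: (T,) per-channel keys/values; w: decay rate; u: bonus for the current token."""
    a, b = 0.0, 0.0                                  # running numerator / denominator
    out = []
    for t in range(len(k)):
        out.append((a + np.exp(u + k[t]) * v[t]) / (b + np.exp(u + k[t])))
        a = np.exp(-w) * a + np.exp(k[t]) * v[t]     # decay old state, add current token
        b = np.exp(-w) * b + np.exp(k[t])
    return np.array(out)

T = 8
print(wkv_recurrence(np.random.randn(T), np.random.randn(T), w=0.5, u=0.1).shape)  # (8,)
```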
Faster Whisper transcription with CTranslate2
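A usage sketch of faster-whisper (model size, device, compute type, and audio path are placeholder choices):

```python
# Sketch: transcribe an audio file with faster-whisper (CTranslate2 backend).
from faster_whisper import WhisperModel

model = WhisperModel("small", device="cpu", compute_type="int8")  # placeholder settings

segments, info = model.transcribe("audio.mp3", beam_size=5)       # placeholder path
print("Detected language:", info.language)
for segment in segments:
    print(f"[{segment.start:.2f}s -> {segment.end:.2f}s] {segment.text}")
```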
BertViz: Visualize Attention in NLP Models (BERT, GPT-2, BART, etc.)