🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
A Convolutional Transformer to decode mental states from Electroencephalography (EEG) for Brain-Computer Interfaces (BCI)
The Hung-yi Lee (李宏毅) Deep Learning Tutorial, recommended by Prof. Lee himself 👍. PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
A high-throughput and memory-efficient inference and serving engine for LLMs
Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
Seamlessly integrate state-of-the-art transformer models into robotics stacks
A BitNet Transformer in Verilog.
Official PyTorch implementation of the CVPR 2024 paper: State Space Models for Event Cameras (Spotlight).
MetaFormer Baselines for Vision (TPAMI 2024)
This is a JAX/Flax-based transformer language model trained on a Japanese dataset. It is based on the official Flax example code (lm1b).
Scalable and user-friendly neural 🧠 forecasting algorithms.
An auto-regressive causal language model for molecule (SMILES) and reaction-template (SMARTS) generation, based on the Hugging Face implementation of OpenAI's GPT-2 transformer decoder.
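As a hedged sketch of the pattern this entry describes (not the repository's actual code), a small GPT-2 decoder can be configured from scratch with the Hugging Face transformers library; the vocabulary size and model dimensions below are assumed for illustration:

```python
# Sketch only: a compact GPT-2 causal language model for SMILES strings.
# All hyperparameters here are hypothetical, not taken from the repository.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=128,   # assumed character-level SMILES vocabulary size
    n_positions=256,  # assumed maximum sequence length
    n_embd=256,
    n_layer=6,
    n_head=8,
)
model = GPT2LMHeadModel(config)  # decoder-only model trained with a causal LM objective
```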
PoolFormer: MetaFormer Is Actually What You Need for Vision (CVPR 2022 Oral)
Lumina-T2X is a unified framework for text-to-any-modality generation.
A TypeScript transformer that automatically generates validation code from your types.
Transformer Balance Research
A PyTorch implementation of the vanilla Transformer.
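For orientation, the vanilla encoder-decoder architecture is also exposed directly by PyTorch as torch.nn.Transformer; a minimal sketch, independent of the repository above and with illustrative shapes:

```python
# Minimal sketch of the vanilla Transformer using torch.nn.Transformer.
# Hyperparameters and tensor shapes are illustrative only.
import torch
import torch.nn as nn

model = nn.Transformer(d_model=512, nhead=8,
                       num_encoder_layers=6, num_decoder_layers=6)

src = torch.rand(10, 32, 512)  # (source length, batch, d_model)
tgt = torch.rand(20, 32, 512)  # (target length, batch, d_model)

# Causal mask: each target position attends only to earlier positions.
tgt_mask = nn.Transformer.generate_square_subsequent_mask(20)

out = model(src, tgt, tgt_mask=tgt_mask)  # shape (20, 32, 512)
```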
Some natural language processing networks from scratch in PyTorch for personal educational purposes.
Transformer Architectures Comparison in Natural Language Generation Tasks