🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Cybertron: the home planet of the Transformers in Go
Easy multi-task learning with HuggingFace Datasets and Trainer
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Research and Materials on Hardware implementation of Transformer Model
Train transformer-based models.
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
A Unified Library for Parameter-Efficient and Modular Transfer Learning
Minimal keyword extraction with BERT
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Generalist and Lightweight Model for Relation Extraction (Extract any relationship types from text)
A model for response quality classification
All the NLP you need here. Currently contains PyTorch implementations of 15 NLP demos (much of the code is borrowed from other open-source projects; it started as a personal playground and was later open-sourced).
Neural Network Compression Framework for enhanced OpenVINO™ inference
Annotations of the interesting ML papers I read
Exploring the impact of contextual attention on Arabic text classification: a study of how contextual attention, as implemented in transformers, influences the performance of generative models for Arabic text classification, through analysis of attention mechanisms and their usefulness.
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting wide-range of NLP tasks from research to industrial applications, including 🗂Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis etc.
Transformers 3rd Edition