🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
👑 Easy-to-use and powerful NLP and LLM library with a 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, etc.
RuTaBERT is a model for Column Type Annotation built on a pre-trained language model (BERT) and trained on a Russian corpus.
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
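The retrieval step at the heart of such RAG and semantic-search pipelines can be sketched in plain Python. This is a minimal illustration using bag-of-words cosine similarity, not the framework's actual API; every function name here is made up for the sketch:

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, top_k=2):
    """Rank documents by similarity to the query; return the top_k best."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:top_k]

docs = [
    "BERT is a transformer encoder for language understanding",
    "FastAPI builds web services in Python",
    "Vector databases store embeddings for semantic search",
]
best = retrieve("semantic search with embeddings", docs, top_k=1)
```

A production framework replaces the toy `embed` with a neural encoder and the linear scan with a vector database, but the pipeline shape (embed, score, rank, return top-k) is the same.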
Address parsing and error recovery via named entity recognition.
FastAPI server for document classification and data extraction.
This repo covers methodologies for applying a pre-trained BERT model to neural machine translation (NMT).
A quick-start tutorial for the Transformers library.
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
All the NLP you need, here. Currently contains PyTorch implementations of 15 NLP demos (much of the code is adapted from other open-source projects; it started as a personal playground and was later open-sourced).
Gateway into the John Snow Labs Ecosystem
Public release of SciFCheX system developed for COM3610 Dissertation Project. The pipeline is designed to perform fact-checking on scientific claims.
Neural Network Compression Framework for enhanced OpenVINO™ inference
Easy and Efficient Transformer: a scalable inference solution for large NLP models.
BERT model on CMS synthetic EHR data for diagnosis and procedure prediction in PyTorch
An elegant PyTorch implementation of transformers.
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
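The c-TF-IDF weighting behind this topic-modeling approach can be sketched in plain Python. This is a minimal illustration of the formula (term frequency per class, scaled by `log(1 + A / f(t))` with `A` the average word count per class and `f(t)` the term's frequency across all classes), not the library's implementation; the function and variable names are made up for the sketch:

```python
import math
from collections import Counter

def c_tf_idf(class_docs):
    """Compute c-TF-IDF weights per class.

    class_docs: dict mapping class label -> concatenated text of that class.
    Returns: dict mapping class label -> {term: weight}.
    Weight formula: tf(t, c) * log(1 + A / f(t)), where A is the average
    word count per class and f(t) is term t's frequency over all classes.
    """
    counts = {c: Counter(text.lower().split()) for c, text in class_docs.items()}
    total = Counter()  # term frequencies across all classes
    for tf in counts.values():
        total.update(tf)
    avg_words = sum(total.values()) / len(counts)
    return {
        label: {t: f * math.log(1 + avg_words / total[t]) for t, f in tf.items()}
        for label, tf in counts.items()
    }

# Each "class" is all documents of one cluster concatenated together.
classes = {
    "sports": "goal match goal team",
    "tech": "model model training data",
}
weights = c_tf_idf(classes)
```

Terms frequent within one class but rare across classes get the highest weights, which is what makes the resulting topic keywords easy to interpret.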
MindSpore online courses: Step into LLM