🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
"Hung-yi Lee Deep Learning Tutorial" (recommended by Prof. Hung-yi Lee 👍, the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Natural Language Processing Tutorial for Deep Learning Researchers
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
Easy-to-use and powerful LLM and SLM library with an awesome model zoo.
This repository contains demos I made with the Transformers library by HuggingFace.
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series).
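Whole word masking (wwm) differs from BERT's original objective in that when a word is selected for masking, all of its WordPiece subtokens are masked together rather than independently. A minimal sketch of the idea (the `whole_word_mask` helper and its parameters are illustrative, not code from the repository):

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, seed=0):
    """Mask whole words in a WordPiece token sequence.

    Continuation pieces start with '##', so consecutive tokens
    ('un', '##affable') form one word and are masked as a unit.
    """
    rng = random.Random(seed)
    # Group token indices into whole words.
    groups = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and groups:
            groups[-1].append(i)  # continuation of the previous word
        else:
            groups.append([i])    # start of a new word
    masked = list(tokens)
    for group in groups:
        if rng.random() < mask_prob:
            for i in group:       # mask every piece of the chosen word
                masked[i] = "[MASK]"
    return masked
```

With standard masking, `##affable` could be masked while `un` stays visible, making the prediction trivial; grouping by word removes that shortcut.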
A large-scale Chinese corpus for NLP.
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
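c-TF-IDF is a class-based variant of TF-IDF: all documents assigned to a topic are concatenated into one pseudo-document, and a term scores highly for a topic when it is frequent there but rare across topics overall. A rough sketch, assuming the commonly documented form score(t, c) = tf(t, c) · log(1 + A / f(t)), where A is the average word count per class and f(t) the term's total frequency (the `c_tf_idf` function and its whitespace tokenization are simplifications, not the library's implementation):

```python
import math
from collections import Counter

def c_tf_idf(docs_per_class):
    """Compute class-based TF-IDF scores per term, per class."""
    # Treat each class as a single concatenated document.
    class_tf = {c: Counter(" ".join(docs).split())
                for c, docs in docs_per_class.items()}
    # A: average number of words per class.
    avg_words = sum(sum(tf.values()) for tf in class_tf.values()) / len(class_tf)
    # f(t): frequency of each term across all classes.
    total_freq = Counter()
    for tf in class_tf.values():
        total_freq.update(tf)
    return {
        c: {t: n * math.log(1 + avg_words / total_freq[t])
            for t, n in tf.items()}
        for c, tf in class_tf.items()
    }
```

The top-scoring terms per class then serve as that topic's human-readable keywords.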
PyTorch implementation of Google AI's 2018 BERT.
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
Transformer-related optimization, including BERT and GPT.
TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning, plus private server deployment.
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Must-read papers on prompt-based tuning for pre-trained language models.
Chinese Language Understanding Evaluation benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard.