Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank (see the fine-tuning sketch after this list).
Pytorch-Named-Entity-Recognition-with-transformers
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
Distillation of a BERT model with the Catalyst framework (a generic distillation-loss sketch follows this list).
FoodBERT: Food Extraction with DistilBERT
Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-base transformers from Hugging Face (see the clustering sketch after this list).
Distilling Task-Specific Knowledge from Teacher Model into BiLSTM
The Codebase for Causal Distillation for Language Models (NAACL '22)
A Python library to classify dialogue tags.
Fine-tuning with Keras.
An automated fact-checking solution that fine-tunes recently published state-of-the-art NLP language models (BERT, RoBERTa, XLNet, ConvBERT...) on available claims and fake-news datasets in order to classify unseen claims.
Classify international patents into one of eight categories based on the text of their titles & abstracts using DistilBert & ONNX Runtime (an ONNX Runtime inference sketch follows this list).
HLE-UPC at SemEval-2021 Task 5: Toxic Spans Detection
Reddit bot that detects Hindi-English code-mixed hate speech in comments in real time, replies with three warnings, and then permanently bans the user.
Re-Evaluating GermEval 2017: Document-Level and Aspect-Based Sentiment Analysis Using Pre-Trained Language Models
Q&A System using BERT and Faiss Vector Database
Advanced RAG pipeline that applies re-ranking after initial retrieval (see the cross-encoder re-ranking sketch after this list).
Text completion with Hugging Face and TensorFlow.js running on Node.js
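Several of the projects above (the SST sentiment model, the Keras sentence classifier, the fact-checking classifiers) boil down to fine-tuning a pretrained checkpoint on a labeled dataset. Below is a minimal sketch using the Hugging Face Trainer API, assuming the sst2 dataset id and the distilbert-base-uncased checkpoint; the hyperparameters are illustrative and not taken from any listed repository.

```python
# Minimal sketch: fine-tuning DistilBERT for binary sentiment classification
# on SST-2, in the spirit of the sentiment fine-tuning projects above.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("sst2")  # Stanford Sentiment Treebank, binary labels
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Truncate/pad each sentence to a fixed length for simple batching.
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
args = TrainingArguments(
    output_dir="distilbert-sst2",       # illustrative output path
    per_device_train_batch_size=16,
    num_train_epochs=2,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()
```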
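The Catalyst-based distillation and the BERT-to-BiLSTM items rely on task-specific knowledge distillation: a student is trained against the teacher's softened output distribution alongside the gold labels. Here is a generic PyTorch sketch of that loss, the standard Hinton-style formulation rather than any listed repository's exact code.

```python
# Sketch: task-specific knowledge distillation loss (teacher logits -> student).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```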
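For the topic clustering item, the underlying recipe is to embed documents with a transformer and then cluster by cosine similarity. A sketch using sentence-transformers and scikit-learn follows; the checkpoint and toy documents are assumptions, and this is not the listed library's actual API.

```python
# Sketch: topic clustering over transformer embeddings with cosine similarity.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import AgglomerativeClustering

docs = [
    "The match ended in a draw.",
    "Stocks rallied after the earnings report.",
    "The striker scored twice in the second half.",
    "Bond yields fell sharply on Friday.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed checkpoint
embeddings = model.encode(docs, normalize_embeddings=True)

# Agglomerative clustering with cosine distance between embeddings.
clustering = AgglomerativeClustering(n_clusters=2, metric="cosine",
                                     linkage="average")
labels = clustering.fit_predict(embeddings)
print(labels)  # e.g. [0, 1, 0, 1]: sports vs. finance
```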
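The patent classifier pairs DistilBert with ONNX Runtime for inference. One way to reproduce that kind of setup is Hugging Face Optimum, which can export a transformers checkpoint to ONNX on the fly; the model id below is an illustrative sentiment checkpoint, not the patent model.

```python
# Sketch: serving a DistilBERT classifier with ONNX Runtime via Optimum.
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative
tokenizer = AutoTokenizer.from_pretrained(model_id)
# export=True converts the PyTorch checkpoint to ONNX at load time.
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)

clf = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(clf("A battery electrode with improved thermal stability."))
```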
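The BERT-plus-Faiss Q&A system follows the usual dense retrieval pattern: encode passages, index the vectors, and search with the encoded question. A sketch with toy passages and an assumed sentence-transformers checkpoint:

```python
# Sketch: dense retrieval for Q&A with a BERT-style encoder and a Faiss index.
import faiss
from sentence_transformers import SentenceTransformer

passages = [
    "BERT was released by Google in October 2018.",
    "DistilBERT is a smaller, faster distillation of BERT.",
    "Faiss is a library for efficient similarity search.",
]
encoder = SentenceTransformer("multi-qa-MiniLM-L6-cos-v1")  # assumed checkpoint
vecs = encoder.encode(passages, normalize_embeddings=True)

# Inner product on unit-normalized vectors equals cosine similarity.
index = faiss.IndexFlatIP(vecs.shape[1])
index.add(vecs)

query = encoder.encode(["When was BERT released?"], normalize_embeddings=True)
scores, ids = index.search(query, k=2)
print([passages[i] for i in ids[0]])  # top-2 passages for the question
```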
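The re-ranking RAG pipeline adds a second stage on top of such retrieval: a cross-encoder scores each query-candidate pair jointly, which is slower than the first-stage retriever but usually more accurate, so it is applied only to the shortlist. A sketch reusing the toy candidates above; the cross-encoder checkpoint is an assumption.

```python
# Sketch: second-stage re-ranking of retrieved candidates with a cross-encoder.
from sentence_transformers import CrossEncoder

query = "When was BERT released?"
candidates = [
    "Faiss is a library for efficient similarity search.",
    "BERT was released by Google in October 2018.",
]
reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")  # assumed
scores = reranker.predict([(query, passage) for passage in candidates])

# Sort candidates by relevance score, best first.
reranked = [p for _, p in sorted(zip(scores, candidates),
                                 key=lambda pair: pair[0], reverse=True)]
print(reranked[0])
```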