news-please - an integrated web crawler and information extractor for news that just works
BERT classification model for processing texts longer than 512 tokens. The text is first split into smaller chunks; each chunk is fed to BERT, and the intermediate results are pooled. The implementation supports fine-tuning.
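The chunk-and-pool approach described above is straightforward to sketch. Below is a minimal illustration using HuggingFace transformers; the chunk size, mean pooling over logits, and the bert-base-uncased checkpoint are illustrative assumptions, not necessarily the repository's exact setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

def classify_long_text(text, chunk_size=510):
    # Tokenize without truncation, then split into chunks that fit
    # BERT's 512-token limit (510 + [CLS] and [SEP])
    ids = tokenizer(text, add_special_tokens=False)["input_ids"]
    chunks = [ids[i:i + chunk_size] for i in range(0, len(ids), chunk_size)]
    logits = []
    for chunk in chunks:
        # Re-add the special tokens around each chunk
        input_ids = torch.tensor(
            [[tokenizer.cls_token_id] + chunk + [tokenizer.sep_token_id]])
        with torch.no_grad():
            logits.append(model(input_ids).logits)
    # Pool the per-chunk intermediate results (here: mean over chunks)
    return torch.cat(logits).mean(dim=0)
```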
Simple and Lightweight Text Classifiers with LLM Embeddings
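The embeddings-plus-lightweight-classifier pattern can be sketched as follows. The sentence-transformers encoder and MiniLM checkpoint are illustrative choices, not necessarily what the repository uses.

```python
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

encoder = SentenceTransformer("all-MiniLM-L6-v2")

texts = ["great product", "awful service", "works fine", "total waste"]
labels = [1, 0, 1, 0]

# Embed once, then train a cheap linear classifier on the frozen embeddings
X = encoder.encode(texts)
clf = LogisticRegression().fit(X, labels)

print(clf.predict(encoder.encode(["really enjoyed it"])))
```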
Chinese Offensive Language Detection using onnx model
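A generic sketch of serving a Transformer classifier from an ONNX file is shown below. The model file name is hypothetical, and the input names ("input_ids", "attention_mask") depend on how the model was exported; the tokenizer checkpoint is likewise an illustrative assumption.

```python
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")  # illustrative
session = ort.InferenceSession("offensive_detector.onnx")       # hypothetical file

enc = tokenizer("你好", return_tensors="np")
logits = session.run(None, {"input_ids": enc["input_ids"],
                            "attention_mask": enc["attention_mask"]})[0]
print(np.argmax(logits, axis=-1))
```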
A stylometry approach for detecting writing patterns and changes, using NLTK, XLM-RoBERTa, Gensim topic modelling, and unsupervised PCA.
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
[ICML'24 Oral] APT: Adaptive Pruning and Tuning Pretrained Language Models for Efficient Training and Inference
Time perception analysis for borderline personality disorder patients
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard
A project demonstrating the use of large language models (LLMs) for text classification, using the RoBERTa model.
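RoBERTa-based classification is easy to demonstrate with the transformers pipeline API; the sentiment checkpoint below is an illustrative choice, not necessarily the one this project uses.

```python
from transformers import pipeline

# Any RoBERTa model fine-tuned for classification could be substituted here
clf = pipeline("text-classification",
               model="cardiffnlp/twitter-roberta-base-sentiment-latest")
print(clf("This paper is a pleasure to read."))
```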
Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
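Inference with Happy Transformer looks roughly like the sketch below, based on its v2-style API; the DistilBERT SST-2 checkpoint is an illustrative choice.

```python
from happytransformer import HappyTextClassification

happy_tc = HappyTextClassification(
    "DISTILBERT", "distilbert-base-uncased-finetuned-sst-2-english",
    num_labels=2)
result = happy_tc.classify_text("What a fantastic library!")
print(result.label, result.score)  # e.g. POSITIVE with a confidence score
```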
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
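The loralib workflow is: swap in LoRA-augmented layers, freeze everything except the low-rank adapters, then checkpoint only the adapter weights. A minimal sketch follows; the layer sizes and rank r are illustrative.

```python
import torch
import loralib as lora

model = torch.nn.Sequential(
    lora.Linear(768, 768, r=16),  # dense layer augmented with rank-16 A/B matrices
    torch.nn.ReLU(),
    torch.nn.Linear(768, 2),
)

# Mark only the LoRA parameters as trainable; all pre-trained weights stay frozen
lora.mark_only_lora_as_trainable(model)

# ... train as usual, then save just the small LoRA state dict
torch.save(lora.lora_state_dict(model), "lora_checkpoint.pt")
```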
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
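A generic Keras/TF2 fine-tuning loop with HuggingFace's TF classes is sketched below; it is not the specific API of the repository above, and the checkpoint, learning rate, and toy data are assumptions.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

enc = tokenizer(["loved it", "hated it"], padding=True, truncation=True,
                return_tensors="tf")
labels = tf.constant([1, 0])

# Standard Keras compile/fit on the model's raw logits
model.compile(optimizer=tf.keras.optimizers.Adam(3e-5),
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.fit(dict(enc), labels, epochs=1, batch_size=2)
```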
🤖 A PyTorch library of curated Transformer models and their composable components
ZaBantu is a fleet of light-weight Masked Language Models for Southern Bantu Languages
Code for team art.-nat.'s submission to SemEval-2024 Task 8 on multi-domain, multi-generator human- vs. machine-generated text classification.
Implementation of our ACL 2020 paper: Structured Tuning for Semantic Role Labeling