Weak Supervised Fake News Detection with RoBERTa, XLNet, ALBERT, XGBoost and Logistic Regression classifiers.
Updated Jun 8, 2021 - Python
Training the first Cypriot large language model on the masked language modeling objective: predicting a masked word token from its surrounding context.
Data enrichment with experimental results of the paper 'Two is Better than Many? Binary Classification as an Effective Approach to Multi-Choice Question Answering'
Demo application for predicting Chula faculty from Thai course description
Just exploring NLP with 🤗 Transformers
A pretrained-model loading framework for BERT and its variants, implemented with TensorFlow 2.x.
Time perception analysis for borderline personality disorder patients
Chinese Offensive Language Detection using onnx model
Radiology reports structuring and knee cartilage disease grade classification with BERT.
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer model pre-trained on a large corpus with two objectives: masked language modeling and next-sentence prediction.
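The masked language modeling objective above can be sketched in plain Python. This is a minimal illustration, not BERT's actual implementation: it follows the commonly described recipe of selecting ~15% of tokens as prediction targets, replacing 80% of those with a `[MASK]` id, 10% with a random token, and leaving 10% unchanged. The token ids and `MASK_ID` value are made up for the example.

```python
import random

MASK_ID = 103  # hypothetical [MASK] token id for this sketch

def mask_tokens(token_ids, vocab_size, mask_prob=0.15, seed=1):
    """BERT-style masking sketch: pick ~mask_prob of positions as
    prediction targets; of those, 80% -> [MASK], 10% -> a random
    token, 10% left unchanged. Returns (inputs, labels), where
    labels holds the original token at masked positions and -100
    (a conventional "ignore" value) everywhere else."""
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must recover the original token
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK_ID            # 80%: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # 10%: random token
            # else: 10% of masked positions keep the original token
    return inputs, labels

inputs, labels = mask_tokens([5, 17, 42, 8, 99, 23], vocab_size=30000)
```

The model is then trained to predict the original tokens at the positions where `labels` is not -100, using the (corrupted) `inputs` as its context.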
Relation Extraction Competition w/ KLUE Dataset @ Naver Boostcamp AI-Tech
An implementation of transfer learning with state-of-the-art transformer language models using Hugging Face and PyTorch.
Which article should I use? Use AI to help you out.
A package built on top of Hugging Face's transformers library that makes it easy to utilize state-of-the-art NLP models
Code for team art.-nat.'s submission to SemEval 2024 task 8 on multi-domain, multi-generator human- vs. machine-generated text classification.
Implementation of the semi-structured inference model in our ACL 2020 paper. INFOTABS: Inference on Tables as Semi-structured Data