A translation based approach for sentiment analysis and intent classification of social media texts in Romanian
Updated Jun 19, 2024 · Jupyter Notebook
A BERT classification model for processing texts longer than 512 tokens. The text is first split into smaller chunks; after each chunk is fed to BERT, the intermediate results are pooled. The implementation supports fine-tuning.
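The chunk-and-pool idea above can be sketched without the model itself. The snippet below is a minimal illustration, not the repository's implementation: it splits a long token-id sequence into overlapping chunks that fit a 512-token encoder and pools per-chunk vectors (random arrays stand in for BERT outputs); the chunk/stride sizes and function names are assumptions.

```python
import numpy as np

MAX_LEN = 510   # leave room for [CLS] and [SEP] in each chunk
STRIDE = 255    # overlap between consecutive chunks

def chunk_tokens(token_ids, max_len=MAX_LEN, stride=STRIDE):
    """Split a long token-id sequence into overlapping chunks."""
    chunks, start = [], 0
    while start < len(token_ids):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break
        start += stride
    return chunks

def pool_chunk_outputs(chunk_vectors, how="mean"):
    """Pool per-chunk encoder vectors into one document vector."""
    stacked = np.stack(chunk_vectors)
    return stacked.mean(axis=0) if how == "mean" else stacked.max(axis=0)

# Toy example: 1200 "token ids", 768-dim stand-ins for BERT chunk outputs.
ids = list(range(1200))
chunks = chunk_tokens(ids)
vecs = [np.random.rand(768) for _ in chunks]
doc_vec = pool_chunk_outputs(vecs)  # one vector for the whole document
```

The pooled `doc_vec` would then go to a classification head; mean pooling keeps gradients flowing to every chunk, which is what makes end-to-end fine-tuning possible.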
news-please - an integrated web crawler and information extractor for news that just works
The programming environment »Open Roberta Lab« by Fraunhofer IAIS enables children and adolescents to program robots. A variety of programming blocks are provided to control the robot's motors and sensors. Open Roberta Lab uses a graphical programming approach so that beginners can start coding seamlessly. As a cloud-based applica…
Simple and Lightweight Text Classifiers with LLM Embeddings
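The approach named above treats precomputed LLM embeddings as fixed features for a small classifier. As a hedged sketch (not the repository's code), a nearest-centroid classifier over embedding vectors is about as lightweight as it gets; the class name and toy 2-d "embeddings" are assumptions for illustration.

```python
import numpy as np

class CentroidClassifier:
    """Nearest-centroid classifier over precomputed text embeddings."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.labels_ = sorted(set(y.tolist()))
        # One mean embedding (centroid) per class label.
        self.centroids_ = np.stack(
            [X[y == lab].mean(axis=0) for lab in self.labels_])
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # Cosine similarity between each input and each class centroid.
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        Cn = self.centroids_ / np.linalg.norm(self.centroids_, axis=1,
                                               keepdims=True)
        sims = Xn @ Cn.T
        return [self.labels_[i] for i in sims.argmax(axis=1)]

# Toy 2-d stand-ins for sentence embeddings.
clf = CentroidClassifier().fit(
    [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]],
    ["neg", "neg", "pos", "pos"])
```

In practice one would replace the toy vectors with embeddings from an API or a local model; the classifier itself stays this small, which is the point of the approach.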
Cross-cultural Inspiration Detection and Analysis in Real and LLM-generated Social Media Data
2020 - [RoBERTa, BiLSTM, SageMaker] Experiments with NER
BERT text classification using ONNX models (BERT, ALBERT, RoBERTa, MacBERT, and so on).
Chinese offensive language detection using an ONNX model
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Stylometry approach detecting writing patterns and changes using NLTK, XLM-RoBERTa, Gensim topic modelling, and unsupervised PCA learning
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Human Language Technologies (HLT) project. Computer Science Master's Degree, University of Pisa. A.Y. 2023/2024
Word embeddings are useful representations of words that capture semantic information. This project trains Word2Vec embeddings, uses RoBERTa (and other embeddings) for semantic text similarity, and also performs text classification.
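The semantic-text-similarity step described above usually comes down to comparing embedding vectors with cosine similarity and ranking candidates by it. A minimal sketch, with function names and toy vectors assumed for illustration (real use would plug in Word2Vec or RoBERTa sentence embeddings):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def most_similar(query_vec, candidate_vecs):
    """Index of the candidate embedding closest to the query."""
    sims = [cosine_similarity(query_vec, c) for c in candidate_vecs]
    return int(np.argmax(sims))
```

Cosine similarity is the standard choice here because embedding magnitudes vary with word frequency and sentence length, while the direction carries the semantic signal.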
[ICML'24 Oral] APT: Adaptive Pruning and Tuning Pretrained Language Models for Efficient Training and Inference
Large Language Model | BERT (Bidirectional Encoder Representations from Transformers)
A Mood Enhancing Web Application
This study investigates the effectiveness of three Transformers (BERT, RoBERTa, XLNet) in handling data sparsity and cold-start problems in recommender systems. We present a Transformer-based hybrid recommender system that predicts missing ratings and extracts semantic embeddings from user reviews to mitigate these issues.
Mitigating a language model's over-confidence with NLI predictions on Multi-NLI hypotheses with random word order, using PAWS (paraphrase) and Winogrande (anaphora).