LLM transformer Classifier with DistilBERT model
Updated Jun 19, 2024 - Python
This project classifies Internet Hinglish memes using multimodal learning. It combines text and image analysis to categorize memes by sentiment and emotion, leveraging the Memotion 3.0 dataset.
Successfully fine-tuned a pretrained DistilBERT transformer model that classifies social media text into one of four cyberbullying labels (ethnicity/race, gender/sexual, religion, or not cyberbullying) with 99% accuracy.
This project is designed to streamline the recruitment process by providing a job and resume matching system and a chatbot for applicants. The key functionalities include job and resume matching and an LLM-powered chatbot.
🗨️ This repository contains a collection of notebooks and resources for various NLP tasks using different architectures and frameworks.
This project involves analyzing and classifying the BoolQ dataset from the SuperGLUE benchmark. We implemented various classifiers and techniques, including rule-based logic, BERT, RNN, and GPT-3/4 data augmentation, achieving performance improvements.
Successfully developed a fine-tuned DistilBERT transformer model that predicts the overall sentiment of a piece of financial news with nearly 81.5% accuracy.
Deep learning for Natural Language Processing
Advanced NLP with Contextual Question Answering: This notebook extracts, cleans, and processes text data from multiple files. It utilizes transformer models for contextual question answering and sentence generation. Perfect for exploring cutting-edge NLP techniques and comparing transformer model performances.
Classification, ADSA, and Text Summarisation project for the BridgeI2I Task at the Inter IIT 2021 Competition. Silver Medalists.
The data and code for my master's thesis for the MA Digital Text Analysis at the University of Antwerp
This repository contains a DistilBERT model fine-tuned using the Hugging Face Transformers library on the IMDb movie review dataset. The model is trained for sentiment analysis, enabling the determination of sentiment polarity (positive or negative) within text reviews.
Using BERT models to perform sentiment analysis on women's clothing
Thesis Project
Aims to build a question-answering product that can understand the information in these articles and answer simple questions about them.
The official repository for the PSYCHIC model
This app searches Reddit posts and comments to determine whether a product or service has a positive or negative sentiment, and predicts top product mentions using Named Entity Recognition.
Implemented pre-trained Transformer-based DistilBERT and multilingual BERT models to classify sentiment as positive or negative and rank it on a scale of 1 to 5.
Fine-tunes the DistilBERT Transformer model with the PyTorch framework, then runs inference on a dataset using the fine-tuned model via the Hugging Face Pipeline.
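The inference step this project describes can be sketched with the Hugging Face `pipeline` helper. This is a minimal illustration, not the repository's own code: the checkpoint name below is the stock SST-2 sentiment DistilBERT published on the Hub, standing in for the project's fine-tuned model, and the input sentence is made up.

```python
# Sketch: sentiment inference with a fine-tuned DistilBERT via the
# Hugging Face `pipeline` helper. Substitute your own fine-tuned
# checkpoint path for the public SST-2 model used here.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# The pipeline handles tokenization, the forward pass, and label
# mapping; it returns a list of {"label", "score"} dicts.
result = classifier("The movie was surprisingly good!")[0]
print(result["label"], round(result["score"], 3))
```

The same `pipeline` call accepts a list of strings, so batch inference over a whole dataset column works without extra code.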
This project centers on improving customer satisfaction by conducting sentiment analysis on customer feedback for an online classes and video-conferencing app. The aim is to decipher customer sentiment in the feedback, extract insights, and improve the user experience while addressing any concerns.