Instruction fine tuning BART for Dialogue Summarization | IT4772E | NLP Project 20232
Updated Jun 18, 2024 - Python
A project built on PyTorch Lightning and Transformers for training Seq2SeqLM models, with a primary focus on MT5 and FLAN-T5, though not limited to them.
This repository contains my team's internship project work at Flexbox Technologies. We developed a system that automatically fills in patient detail forms using patient data extracted from PDF files.
This repository contains one of the projects I created during my college's MINeD hackathon.
📣 Fourth task of the CodSoft internship. During my internship at CodSoft, I developed a Spam SMS Detection model: a machine-learning solution that automatically classifies SMS messages as "ham" or "spam".
Tutorial for training a Flan-T5-based model using Flax on GCP TPUs.
The official fork of THoR Chain-of-Thought framework, enhanced and adapted for Emotion Cause Analysis (ECAC-2024)
Naive Bayes model: cleaned the data, engineered features, and achieved strong classification results. FLAN-T5 large language model: explored zero-shot, one-shot, and few-shot inference techniques.
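Few-shot inference works by prepending labeled examples to the instruction before the query. A minimal sketch of how such a prompt can be assembled for an instruction-tuned model like FLAN-T5 (the exact prompt wording here is an illustrative assumption, not the repository's format):

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt from (input, completion) pairs.

    examples: list of (dialogue, summary) tuples used as demonstrations.
    query:    the new dialogue the model should summarize.
    """
    parts = []
    for dialogue, summary in examples:
        # Each demonstration repeats the instruction plus a completed answer.
        parts.append(f"Summarize the following dialogue:\n{dialogue}\nSummary: {summary}")
    # The final block ends with "Summary:" so the model continues from there.
    parts.append(f"Summarize the following dialogue:\n{query}\nSummary:")
    return "\n\n".join(parts)
```

With an empty `examples` list this degenerates to zero-shot inference; with one pair it is one-shot. The resulting string is what gets tokenized and fed to the model.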
A preliminary investigation of ontology matching (OM) with large language models (LLMs).
Document summarization app using a large language model (LLM) and the LangChain framework. Uses a pre-trained T5 model and its tokenizer from the Hugging Face Transformers library, and builds a summarization pipeline to generate summaries with the model.
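Because T5 has a limited input length, summarization apps of this kind typically split long documents into chunks before summarizing each one. A minimal sketch of that pre-processing step, with the Transformers call shown as a commented hint (the `t5-small` checkpoint and word-based chunk size are illustrative assumptions, not this repository's exact configuration):

```python
def chunk_words(text, max_words=400):
    """Split a document into word-based chunks that fit the model's context.

    Word counting is a rough proxy for token counting; a production pipeline
    would measure length with the model's own tokenizer instead.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

# Summarizing each chunk would then look roughly like this
# (requires the transformers library and a model download):
#
# from transformers import pipeline
# summarizer = pipeline("summarization", model="t5-small")
# summaries = [summarizer("summarize: " + chunk)[0]["summary_text"]
#              for chunk in chunk_words(document)]
```

The per-chunk summaries can be concatenated, or summarized again, to produce the final output (the "map-reduce" pattern LangChain provides for this).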
Symbol Team model for PAN@AP 2023 shared task on Profiling Cryptocurrency Influencers with Few-shot Learning
A Gradio frontend for Google's Flan-T5 Large language model; it can also be adjusted for other model sizes.
Rethinking Negative Instances for Generative Named Entity Recognition
The TABLET benchmark for evaluating instruction learning with LLMs for tabular prediction.
Fine-tuning of the Flan-T5 LLM for text classification.