LongT5-based model pre-trained on a large amount of unlabeled Vietnamese news texts and fine-tuned with ViMS and VMDS collections
A repository for my Diploma Thesis; "Semantic communications framework with Transformer based models"
The "LLM Projects Archive" is a centralized GitHub repository offering a diverse collection of Large Language Model (LLM) projects. A valuable resource for researchers, developers, and enthusiasts, it showcases the latest advancements and applications in LLMs. Explore and contribute to the dynamic landscape of language model projects.
A QA-first, hallucination-lite, multi-LM summarizer, as seen in TREC 2023
Python script for text summarization using the T5 model.
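For context on what such a script typically looks like, here is a minimal sketch of T5-based summarization with the Hugging Face `transformers` library. The model name `t5-small`, the task prefix, and the generation settings are illustrative assumptions, not details taken from the repository above.

```python
# Minimal T5 summarization sketch (assumes `transformers` and a
# PyTorch backend are installed; "t5-small" is an illustrative choice).
from transformers import AutoTokenizer, T5ForConditionalGeneration


def summarize(text: str, model_name: str = "t5-small") -> str:
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)
    # T5 was trained with task prefixes, so summarization inputs
    # are prefixed with "summarize: ".
    inputs = tokenizer(
        "summarize: " + text,
        return_tensors="pt",
        max_length=512,
        truncation=True,
    )
    output_ids = model.generate(
        inputs.input_ids, max_length=60, num_beams=4, early_stopping=True
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    article = (
        "The James Webb Space Telescope, launched in December 2021, "
        "is the largest optical telescope in space and observes the "
        "universe primarily in the infrared spectrum."
    )
    print(summarize(article))
```

The same pattern generalizes to other T5 tasks by changing the prefix (e.g. `"translate English to German: "`).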
ChatLM-Chinese-0.2B, a small 0.2B-parameter Chinese dialogue model. Open-sources the complete pipeline: dataset sources, data cleaning, tokenizer training, model pre-training, SFT instruction fine-tuning, and RLHF optimization. Supports SFT fine-tuning for downstream tasks, with a worked example of triple-based information-extraction fine-tuning.
Code and Assets for "Benchmarking and Improving Text-to-SQL Generation Under Ambiguity" (EMNLP 2023)
This repository explores enhancing dialogue summarization with commonsense knowledge through the SICK framework, evaluating models on dialogue datasets to assess how commonsense knowledge affects summarization quality.
Training a paraphrasing model using Hugging Face T5
Developed a chatbot using generative AI with Large Language Models (LLMs); the model used is T5 (seq2seq)
A project to apply Abbott’s heuristics to natural language software engineering scenarios using NLP.
This repository contains the code to reproduce the results in the paper GAVI: A Category-Aware Generative Approach for Brand Value Identification.
Evaluating Large Language Models with Instructions and Prompts
This repository contains scripts to export T5 model to torchscript or onnx.
CodeT5 LLM fine-tuned with C++ code from KDE
Summarizes Wikipedia articles with the Transformers T5 model. Productized as a Python program with a FastAPI REST API and a Streamlit web app.
Code for the paper "NASH: A Simple Unified Framework of Structured Pruning for Accelerating Encoder-Decoder Language Models" (EMNLP 2023 Findings)
Official repository of Generating Multiple-Length Summaries via Reinforcement Learning for Unsupervised Sentence Summarization [EMNLP'22 Findings]