
Dr. Llama: Improving Small Language Models Through Generative Data Augmentation

This repository contains the code, data, and resources related to our project titled "Dr. Llama: Improving Small Language Models Through Generative Data Augmentation."

Table of Contents

  • Introduction
  • Installation
  • Usage
  • Contributing
  • Citation
  • License
  • Contact

Introduction

In this project, we aim to improve the performance of small language models (SLMs) through generative data augmentation. Specifically, we introduce Dr. Llama, a framework that generates new training data to improve the accuracy of SLMs on natural language processing tasks. Our approach trains a large language model (LLM) on a large corpus of text data and then uses this model to generate new training examples that are similar in style and structure to the original data. We evaluate our approach on several NLP datasets and demonstrate that Dr. Llama can improve the performance of SLMs across a range of tasks.

Installation

To set up the project on a local machine or server, clone the repository and install its Python dependencies, as sketched below.
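A minimal sketch of a standard pip-based setup; the repository does not document its exact dependencies here, so the requirements file name below is an assumption:

git clone https://github.com/zguo0525/Dr.llama.git
cd Dr.llama
pip install -r requirements.txt   # assumed dependency file name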

Usage

Usage follows the pipeline described in the introduction: a large language model generates synthetic training examples, which are then added to the small model's fine-tuning data. Most of the code lives in Jupyter notebooks; a condensed sketch of the generation step follows.
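A minimal sketch of the generation step, assuming a Hugging Face transformers setup; the model name, seed prompts, and generation settings are illustrative assumptions, not the repository's actual configuration:

# Generate synthetic training examples with a "teacher" LLM.
# Model name, prompts, and generation settings are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

GENERATOR = "gpt2"  # stand-in for the large teacher model

tokenizer = AutoTokenizer.from_pretrained(GENERATOR)
model = AutoModelForCausalLM.from_pretrained(GENERATOR)

# Seed prompts written in the style of the target (domain-specific QA) data.
seed_prompts = [
    "Q: What causes iron-deficiency anemia? A:",
    "Q: How is type 2 diabetes typically managed? A:",
]

augmented = []
for prompt in seed_prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,                      # sampling yields varied examples
        top_p=0.95,
        num_return_sequences=3,
        pad_token_id=tokenizer.eos_token_id,
    )
    for seq in outputs:
        augmented.append(tokenizer.decode(seq, skip_special_tokens=True))

# The synthetic examples are then mixed into the SLM's fine-tuning set.
print(f"Generated {len(augmented)} synthetic training examples")

The synthetic examples would then be combined with the original data when fine-tuning the small model, which is the augmentation step the project evaluates.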

Contributing

Contributions to the project are always welcome! Please read the contributing guidelines before getting started.

Citation

If you find this project helpful for your research, please cite our work using the following BibTeX entry:

@inproceedings{2023drllama,
  title={Dr. Llama: Improving Small Language Models in Domain-Specific QA via Generative Data Augmentation},
  author={Zhen Guo and Peiqi Wang and Yanwei Wang and Shangdi Yu},
  url={https://github.com/zguo0525/Dr.llama},
  year={2023}
}

License

This project is licensed under the MIT License.

Contact

For any questions or comments, please feel free to reach out to the corresponding author:

  • [Name of Corresponding Author]
  • Email: [email address of the corresponding author]
