code for fine tuning LLaMA2 7B on python codebase


pratikshappai/pyllama

ece8803-gdl-project

Repository to fine-tune the LLaMA-2 7B (7-billion-parameter) model to generate Python code from a natural-language description of the code.

Setup

Hardware requirements: 1x NVIDIA A100 40 GB GPU and 32 GB of system RAM.
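For context on why a 40 GB card suffices, a back-of-envelope VRAM estimate for the model weights alone (our own arithmetic, not from the repository; real usage also includes optimizer states, activations, and batch size):

```python
# Rough VRAM estimate for a 7B-parameter model's weights.
# Illustrative only: excludes optimizer states, gradients, and activations.
PARAMS = 7e9  # 7 billion parameters

def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    """Memory footprint of the raw weights in GiB."""
    return params * bytes_per_param / 1024**3

fp16_gb = weight_memory_gb(PARAMS, 2)    # fp16: ~13 GiB for weights alone
int4_gb = weight_memory_gb(PARAMS, 0.5)  # 4-bit quantized (QLoRA-style): ~3.3 GiB
```

Full fp16 fine-tuning of 7B parameters (weights + gradients + Adam states) would not fit in 40 GB, which is why parameter-efficient methods such as LoRA/QLoRA are the usual choice at this scale.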

To set up the project, follow the steps below:

  1. Clone the repository.
  2. Install the requirements:
pip install -r requirements.txt
  3. We use a Python code instruction dataset from here.
  4. Our base model is the chat variant of LLaMA-2 7B by NousResearch.
  5. Run the following command to fine-tune the model:
python finetune.py
  6. Run the following command to generate code from a natural-language description:
python generate.py
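Instruction fine-tuning of a LLaMA-2 chat model requires rendering each (instruction, output) record from the dataset into the LLaMA-2 chat prompt template before training. A minimal sketch of that formatting step (the system prompt and helper name are our own assumptions, not taken from finetune.py):

```python
# Sketch: render one instruction-dataset record into the LLaMA-2 chat
# template ("<s>[INST] <<SYS>> ... <</SYS>> ... [/INST] ... </s>").
# DEFAULT_SYSTEM and build_training_example are hypothetical names,
# not the repository's actual code.

DEFAULT_SYSTEM = "You are a helpful assistant that writes Python code."

def build_training_example(instruction: str, output: str,
                           system: str = DEFAULT_SYSTEM) -> str:
    """Format one (instruction, output) pair as a LLaMA-2 chat example."""
    return (
        f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"
        f"{instruction} [/INST] {output} </s>"
    )

example = build_training_example(
    "Write a function that returns the square of a number.",
    "def square(x):\n    return x * x",
)
```

At generation time the same template is used, but with the text after `[/INST]` left empty so the model completes it.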

Results & Documentation

  • Some result snippets can be found in the results directory.
  • Some training graphs can be found in the training_graphs directory.
  • The report and slides for the project can be found in the repository.
