DPPA: Pruning Method for Large Language Model to Model Merging

This repository contains the code for our paper (paper).

Directory

Usage

Please change the `cache` variable in `run.sh` to the path of this project.
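As a sketch of that step (the exact layout of `run.sh` is not shown here, so the variable name `cache` is an assumption taken from the instruction above), pointing it at your checkout could look like:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: replace the placeholder with the absolute path
# of your DPPA checkout before launching run.sh.
cache="/path/to/DPPA"
echo "cache is set to: ${cache}"
```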

Requirements
  1. python 3.10
  2. cuda 12.1
  3. torch 2.1.0
Installation

```shell
conda create -n DPPA python=3.10
conda activate DPPA
conda install nvidia/label/cuda-12.1.0::cuda
pip install -r requirements.txt
pip install torch==2.1.0+cu121 --index-url https://download.pytorch.org/whl/cu121
```
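After installation, you can sanity-check that the environment matches the pinned versions. This minimal Python sketch parses the version strings from the requirements above; the commented-out `torch` checks are for the live environment and assume a standard `torch.__version__` / `torch.version.cuda` layout:

```python
def version_tuple(s: str) -> tuple:
    """Parse a version string like '2.1.0+cu121' into (2, 1, 0),
    dropping any local build tag after '+'."""
    return tuple(int(p) for p in s.split("+")[0].split("."))

# The versions pinned in the requirements above.
assert version_tuple("2.1.0+cu121") == (2, 1, 0)  # torch
assert version_tuple("12.1") == (12, 1)           # CUDA

# Uncomment after installation to check the live environment:
# import torch
# assert version_tuple(torch.__version__)[:3] == (2, 1, 0)
# assert version_tuple(torch.version.cuda) == (12, 1)
print("version pins parse correctly")
```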

Directory layout

```
DPPA
├── README.md
├── model/
│   ├── llama2-7B/
│   ├── Abel-7B-001/
│   └── other SFT models/
├── gair_abel/
├── scrips/
│   └── pruning/
├── run.sh
└── requirements.txt
```

Acknowledgement

This repository is built upon the OWL repository.
