LAD: Length-Adaptive Distillation

This repo provides the implementation of our work "Length-Adaptive Distillation: Customizing Small Language Model for Dynamic Token Pruning", published in Findings of EMNLP 2023.

Our implementation is mainly based on transformers. We use the same data augmentation code provided by TinyBERT, and we follow LAT to calculate the speedup ratio.
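
As a rough illustration of what a transformers-based distillation step looks like, the sketch below combines a soft-label (KL) loss from a teacher with the student's hard-label loss. The checkpoint names, temperature, and loss weighting are illustrative assumptions only, not the configuration used in this repository.

```python
# Minimal teacher-student distillation sketch with Hugging Face transformers.
# Checkpoints, temperature, and loss weights are placeholders, not LAD's setup.
import torch
import torch.nn.functional as F
from transformers import AutoModelForSequenceClassification, AutoTokenizer

teacher = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
student = AutoModelForSequenceClassification.from_pretrained("prajjwal1/bert-tiny", num_labels=2)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

batch = tokenizer(["an example sentence"], return_tensors="pt", padding=True)
labels = torch.tensor([1])

with torch.no_grad():
    teacher_logits = teacher(**batch).logits  # teacher is frozen

student_out = student(**batch, labels=labels)

# Soft-label KL loss at temperature T, combined with the hard-label loss.
T = 2.0
kd_loss = F.kl_div(
    F.log_softmax(student_out.logits / T, dim=-1),
    F.softmax(teacher_logits / T, dim=-1),
    reduction="batchmean",
) * (T * T)
loss = 0.5 * kd_loss + 0.5 * student_out.loss
loss.backward()
```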
