Transformer Attention Benchmark

The goal of this repository is to benchmark the speed of different attention-XL implementations. The implementations currently compared come from DI-engine, Hugging Face Transformers, and labml.ai, plus a few standalone implementations (see the installation steps below).
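
As a rough illustration of what this kind of speed benchmark measures, below is a minimal timing sketch. It is not the repository's benchmark script: torch.nn.MultiheadAttention stands in for an attention-XL module, the time_attention helper is invented for this example, and the shapes and iteration counts are illustrative assumptions.

# Minimal timing sketch, NOT the repository's benchmark script.
# torch.nn.MultiheadAttention is only a stand-in for an attention-XL module;
# shapes and iteration counts below are illustrative assumptions.
import time
import torch

def time_attention(module, seq_len=256, batch=16, d_model=512,
                   warmup=10, iters=100, device="cpu"):
    """Return the average forward-pass time of `module` in seconds."""
    x = torch.randn(seq_len, batch, d_model, device=device)
    module = module.to(device).eval()
    with torch.no_grad():
        for _ in range(warmup):        # warm up caches / lazy initialization
            module(x, x, x)
        if device == "cuda":
            torch.cuda.synchronize()   # flush queued GPU kernels before timing
        start = time.perf_counter()
        for _ in range(iters):
            module(x, x, x)
        if device == "cuda":
            torch.cuda.synchronize()   # wait for the timed kernels to finish
        return (time.perf_counter() - start) / iters

if __name__ == "__main__":
    attn = torch.nn.MultiheadAttention(embed_dim=512, num_heads=8)
    print(f"average forward time: {time_attention(attn):.6f} s")

The same loop can be pointed at any attention module; on GPU, the torch.cuda.synchronize() calls are what keep the wall-clock measurement honest, since CUDA kernels launch asynchronously.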

Benchmark results

Run

Clone this repository

git clone https://github.com/davide97l/attention-benchmark
cd attention-benchmark

Install DI-engine

git clone https://github.com/opendilab/DI-engine
cd DI-engine
pip install -e .
cd ..

Install Hugging Face Transformers

git clone https://github.com/huggingface/transformers
cd transformers
pip install -e .
cd ..

Install labml.ai annotated implementations

git clone https://github.com/labmlai/annotated_deep_learning_paper_implementations
cd annotated_deep_learning_paper_implementations
pip install -e .
cd ..

The remaining implementations have no intra-framework dependencies, so there is no need to install their frameworks: each ships as a single file that can be run directly.
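
For example, one of these standalone files can be run as follows (the file name here is hypothetical; use the actual script shipped in the repository):

python attention_xl_standalone.py  # hypothetical file name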
