The goal of this repository is to benchmark the speed of different attention-XL implementations. The current implementations come from DI-engine, Hugging Face Transformers, and labml.ai's annotated paper implementations; installation instructions for each are given below.
Clone this repository:
```shell
git clone https://github.com/davide97l/attention-benchmark
cd attention-benchmark
```
Install DI-engine:
```shell
git clone https://github.com/opendilab/DI-engine
cd DI-engine
pip install -e .
cd ..
```
Install Hugging Face Transformers:
```shell
git clone https://github.com/huggingface/transformers
cd transformers
pip install -e .
cd ..
```
Install labml.ai's annotated implementations:
```shell
git clone https://github.com/labmlai/annotated_deep_learning_paper_implementations
cd annotated_deep_learning_paper_implementations
pip install -e .
cd ..
```
The remaining implementations have no intra-framework dependencies, so there is no need to install their frameworks: each one ships as a single, immediately runnable file.
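The repository's own benchmarking scripts are not reproduced here, but the kind of measurement involved can be sketched with a plain NumPy baseline. Everything below (the `attention` function, the `benchmark` helper, and the tensor shapes) is illustrative and not part of this repository or any of the frameworks above:

```python
import time
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # plain scaled dot-product attention: softmax(QK^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    return softmax(scores) @ v

def benchmark(fn, *args, repeats=10):
    # time fn over several runs; report the best wall-clock time,
    # which is less noisy than the mean for micro-benchmarks
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - start)
    return min(times)

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 128, 64))  # (batch, seq_len, head_dim)
k = rng.standard_normal((4, 128, 64))
v = rng.standard_normal((4, 128, 64))
print(f"best of 10 runs: {benchmark(attention, q, k, v):.6f} s")
```

A real comparison would run each framework's attention-XL module under the same harness, with identical input shapes and a warm-up pass before timing.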