Documentation: Stable, Nightly | Install: Linux, macOS, Windows, From Source | Contribute: Guidelines
fairseq2 is a sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling, and other content generation tasks. It is also the successor of fairseq.
- An implementation of Mistral 7B and Mistral 7B instruct (arXiv) models with Grouped-Query Attention and Sliding Window Attention. Check out the terminal-based interactive demo chat application under recipes.
- An interactive terminal-based demo chat application for LLaMA 7B Chat with system prompt support.
- A new, unified, and efficient sequence generation API for both decoder and encoder-decoder models with Beam Search, TopK Sampling, and TopP (a.k.a. Nucleus) Sampling along with toxicity prevention features (a brief nucleus sampling sketch follows this list).
- Support for PyTorch SDPA/Flash Attention in Relative Position SDPA and Shaw Relative Position SDPA.
- Lazy padding mask and attention mask initialization for more efficient integration with fused SDPA implementations.
- A new sampling operator in our C++-based data pipeline API.
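The generation API itself is documented in the API reference. As a rough illustration of what TopP (nucleus) sampling does, here is a minimal sketch in plain PyTorch; it is not the fairseq2 API, and the function name and shapes are purely illustrative:

```python
# Minimal, self-contained sketch of Top-P (nucleus) sampling in plain PyTorch.
# This is NOT the fairseq2 generation API; it only illustrates the idea of
# keeping the smallest set of tokens whose cumulative probability exceeds p.
import torch


def top_p_sample(logits: torch.Tensor, p: float = 0.9) -> torch.Tensor:
    """Sample one token id per row from `logits` using nucleus sampling."""
    probs = torch.softmax(logits, dim=-1)
    sorted_probs, sorted_idx = torch.sort(probs, descending=True, dim=-1)
    cumulative = torch.cumsum(sorted_probs, dim=-1)

    # Drop tokens once the cumulative probability has already passed `p`,
    # but always keep at least the most probable token.
    mask = cumulative - sorted_probs > p
    sorted_probs = sorted_probs.masked_fill(mask, 0.0)
    sorted_probs = sorted_probs / sorted_probs.sum(dim=-1, keepdim=True)

    choice = torch.multinomial(sorted_probs, num_samples=1)
    return sorted_idx.gather(-1, choice)


# Example: sample from a batch of two fake next-token distributions.
logits = torch.randn(2, 32000)
print(top_p_sample(logits, p=0.95))
```

TopK works the same way, except that it keeps a fixed number of candidates instead of a probability mass.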
You can find our full documentation including tutorials and API reference here.
For recent changes, you can check out our changelog.
As of today, the following models are available in fairseq2:
fairseq2 is also used by various external projects such as:
fairseq2 depends on libsndfile, which can be installed via the system package manager on most Linux distributions. For Ubuntu-based systems, run:
```sh
sudo apt install libsndfile1
```
Similarly, on Fedora, run:
```sh
sudo dnf install libsndfile
```
For other Linux distributions, please consult your distribution's documentation on how to install packages.
To install fairseq2 on Linux x86-64, run:
```sh
pip install fairseq2
```
This command will install a version of fairseq2 that is compatible with PyTorch hosted on PyPI.
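To quickly verify the installation, you can import the package and query its installed version. This sketch uses only the standard library's importlib.metadata, so it does not assume anything about fairseq2's internals:

```python
# Sanity check after `pip install fairseq2`: the import fails if the install is
# broken, and importlib.metadata reports the installed distribution version.
from importlib.metadata import version

import fairseq2  # noqa: F401

print(version("fairseq2"))
```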
At this time, we do not offer a pre-built package for ARM-based systems such as the Raspberry Pi or NVIDIA Jetson. Please refer to Install From Source to learn how to build and install fairseq2 on those systems.
Besides PyPI, fairseq2 also has pre-built packages available for different PyTorch and CUDA versions hosted on FAIR's package repository. The following matrix shows the supported combinations.
| fairseq2 | PyTorch | Python | Variant* | Arch |
|---|---|---|---|---|
| HEAD | 2.3.0 | >=3.8, <=3.11 | cpu, cu118, cu121 | x86_64 |
| | 2.2.0, 2.2.1, 2.2.2 | >=3.8, <=3.11 | cpu, cu118, cu121 | x86_64 |
| | 2.1.0, 2.1.1, 2.1.2 | >=3.8, <=3.11 | cpu, cu118, cu121 | x86_64 |
| 0.2.0 | 2.1.1 | >=3.8, <=3.11 | cpu, cu118, cu121 | x86_64 |
| | 2.0.1 | >=3.8, <=3.11 | cpu, cu117, cu118 | x86_64 |
| | 1.13.1 | >=3.8, <=3.10 | cpu, cu116 | x86_64 |

\* cuXYZ refers to CUDA XY.Z (e.g. cu118 means CUDA 11.8)
To install a specific combination, first follow the installation instructions on pytorch.org for the desired PyTorch version, and then use the following command (shown for PyTorch 2.3.0 and variant cu118):

```sh
pip install fairseq2 \
  --extra-index-url https://fair.pkg.atmeta.com/fairseq2/whl/pt2.3.0/cu118
```
Warning
fairseq2 relies on the C++ API of PyTorch which has no API/ABI compatibility between releases. This means you have to install the fairseq2 variant that exactly matches your PyTorch version. Otherwise, you might experience issues like immediate process crashes or spurious segfaults. For the same reason, if you upgrade your PyTorch version, you must also upgrade your fairseq2 installation.
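If you are not sure which PyTorch build is active in your environment, printing its version and CUDA build is a quick way to pick the matching fairseq2 variant:

```python
# Print the installed PyTorch version and the CUDA version it was built
# against; use these to choose the matching pt<version>/<variant> index
# (torch.version.cuda is None for CPU-only builds).
import torch

print("PyTorch:", torch.__version__)      # e.g. 2.3.0
print("CUDA build:", torch.version.cuda)  # e.g. 11.8 -> cu118, or None -> cpu
```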
For Linux, we also host nightly builds on FAIR's package repository. The supported variants are identical to the ones listed in Variants above. Once you have installed the desired PyTorch version, you can use the following command to install the corresponding nightly package (shown for PyTorch 2.3.0 and variant cu118):

```sh
pip install fairseq2 \
  --pre --extra-index-url https://fair.pkg.atmeta.com/fairseq2/whl/nightly/pt2.3.0/cu118
```
fairseq2 depends on libsndfile, which can be installed via Homebrew:
```sh
brew install libsndfile
```
To install fairseq2 on ARM64-based (i.e. Apple silicon) Mac computers, run:
```sh
pip install fairseq2
```
This command will install a version of fairseq2 that is compatible with PyTorch hosted on PyPI.
At this time, we do not offer a pre-built package for Intel-based Mac computers. Please refer to Install From Source to learn how to build and install fairseq2 on Intel machines.
Besides PyPI, fairseq2 also has pre-built packages available for different PyTorch versions hosted on FAIR's package repository. The following matrix shows the supported combinations.
| fairseq2 | PyTorch | Python | Arch |
|---|---|---|---|
| HEAD | 2.3.0 | >=3.8, <=3.11 | arm64 |
To install a specific combination, first follow the installation instructions on pytorch.org for the desired PyTorch version, and then use the following command (shown for PyTorch 2.3.0):

```sh
pip install fairseq2 \
  --extra-index-url https://fair.pkg.atmeta.com/fairseq2/whl/pt2.3.0/cpu
```
Warning
fairseq2 relies on the C++ API of PyTorch which has no API/ABI compatibility between releases. This means you have to install the fairseq2 variant that exactly matches your PyTorch version. Otherwise, you might experience issues like immediate process crashes or spurious segfaults. For the same reason, if you upgrade your PyTorch version, you must also upgrade your fairseq2 installation.
For macOS, we also host nightly builds on FAIR's package repository. The supported variants are identical to the ones listed in Variants above. Once you have installed the desired PyTorch version, you can use the following command to install the corresponding nightly package (shown for PyTorch 2.3.0):

```sh
pip install fairseq2 \
  --pre --extra-index-url https://fair.pkg.atmeta.com/fairseq2/whl/nightly/pt2.3.0/cpu
```
fairseq2 does not have native support for Windows and there are no plans to support it in the foreseeable future. However, you can use fairseq2 via the Windows Subsystem for Linux (a.k.a. WSL) along with full CUDA support introduced in WSL 2. Please follow the instructions in the Installing on Linux section for a WSL-based installation.
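After setting up PyTorch inside WSL 2, a quick way to confirm that CUDA is actually reachable before installing a CUDA variant of fairseq2 is:

```python
# Run inside WSL 2 with a CUDA-enabled PyTorch build installed; requires a
# recent NVIDIA driver on the Windows host.
import torch

print(torch.cuda.is_available())  # True if the GPU is visible from WSL
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```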
See the Install From Source instructions.
We always welcome contributions to fairseq2! Please refer to Contribution Guidelines to learn how to format, test, and submit your work.
If you use fairseq2 in your research and wish to refer to it, please use the following BibTeX entry.
```bibtex
@software{balioglu2023fairseq2,
  author = {Can Balioglu},
  title = {fairseq2},
  url = {http://github.com/facebookresearch/fairseq2},
  year = {2023},
}
```
This project is MIT licensed, as found in the LICENSE file.