Package for the analysis of the factored self-attention mechanism through a simple one-layer attention-based DCA model, described in:
- Caredda F., Pagnani A., Direct Coupling Analysis and the Attention Mechanism, bioRxiv:579080
This is an unregistered package. To install it, enter `]` in the Julia REPL to open the package manager, then run:

pkg> add https://github.com/pagnani/AttentionDCA.jl
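Equivalently, installation can be scripted with the standard Pkg API (a minimal sketch; the URL is the one above):

```julia
import Pkg

# Add the unregistered package directly from its GitHub repository.
Pkg.add(url = "https://github.com/pagnani/AttentionDCA.jl")
```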
The functions for training are:
- trainer, stat_trainer
- artrainer, stat_artrainer
- multi_trainer, stat_multi_trainer
- multi_artrainer

These take as input either tuples with the integer-encoded MSA and its weight vector, or directly the path to the FASTA file of the protein family (a usage sketch is given below). For the full signatures and options, type `?trainer` in the REPL.
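A minimal, hedged sketch of a training call. The argument layout, epoch count, and toy data below are illustrative assumptions, not the package's confirmed interface; consult `?trainer` for the authoritative signature.

```julia
using AttentionDCA

# Toy integer-encoded MSA: L = 70 positions × M = 1000 sequences, with an
# alphabet of 21 symbols (20 amino acids + gap). Real inputs would come
# from your own alignment preprocessing.
Z = rand(1:21, 70, 1000)
W = fill(1.0 / 1000, 1000)   # uniform statistical weights, summing to 1

# Hypothetical call: the MSA, its weights, and a number of epochs, per the
# description above. Keyword options (e.g. number of heads, inner
# dimension) may differ; check ?trainer before running.
out = trainer(Z, W, 100)
```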
A detailed example of how to use the package can be found in notebooks/ExampleAttentionDCA. To run it, clone the repository and follow the instructions inside the notebook.
All data used in this study are publicly available in the `data` folder at https://github.com/francescocaredda/DataAttentionDCA.
Any questions can be directed to francesco.caredda@polito.it.