Implementation of SE3-Transformers for equivariant self-attention, in Pytorch. This repository is geared toward integration into an eventual Alphafold2 replication.
Implementation of the Lie Transformer, equivariant self-attention, in Pytorch
Numerically stable implementation of batched SE(3) exponential and logarithmic maps
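The stability concern in the last entry comes from the rotation-angle coefficients of the SE(3) exponential map, which divide by powers of the angle and lose precision near zero. Below is a minimal, unbatched NumPy sketch of the standard technique (Rodrigues-style closed form with small-angle Taylor fallbacks); the function name `se3_exp`, the twist ordering `(v, w)`, and the threshold are illustrative assumptions, not that repository's API.

```python
import numpy as np

def hat(w):
    """Skew-symmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def se3_exp(xi, eps=1e-8):
    """Map a twist xi = (v, w) in R^6 to a 4x4 rigid transform.

    The closed-form coefficients sin(t)/t, (1-cos(t))/t^2, and
    (t-sin(t))/t^3 are replaced by Taylor expansions when the
    rotation angle t is tiny, which is where the naive formulas
    become numerically unstable.
    """
    v, w = xi[:3], xi[3:]
    t = np.linalg.norm(w)
    W = hat(w)
    if t < eps:
        # Small-angle series for the three coefficients
        A = 1.0 - t**2 / 6.0
        B = 0.5 - t**2 / 24.0
        C = 1.0 / 6.0 - t**2 / 120.0
    else:
        A = np.sin(t) / t
        B = (1.0 - np.cos(t)) / t**2
        C = (t - np.sin(t)) / t**3
    R = np.eye(3) + A * W + B * (W @ W)          # rotation (Rodrigues)
    V = np.eye(3) + B * W + C * (W @ W)          # left Jacobian of SO(3)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = V @ v
    return T
```

A batched PyTorch version follows the same structure, with `torch.where` selecting between the series and closed-form coefficients elementwise so the whole batch stays differentiable.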