Implementation of SE3-Transformers for Equivariant Self-Attention, in PyTorch. This repository is geared toward integration with an eventual Alphafold2 replication.
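A minimal usage sketch for an SE(3)-equivariant attention module of this kind. The package name `se3_transformer_pytorch`, the `SE3Transformer` class, and its constructor arguments (`dim`, `heads`, `depth`, `dim_head`, `num_degrees`) are assumptions based on a typical README and may differ between versions; the main point is the interface shape: per-node features, 3D coordinates, and an optional mask go in, updated per-node features come out.

```python
import torch
from se3_transformer_pytorch import SE3Transformer  # assumed package/class name

# Assumed constructor arguments; check the repository's README for the exact API.
model = SE3Transformer(
    dim = 64,          # per-node feature dimension
    heads = 4,         # attention heads
    depth = 2,         # number of equivariant attention layers
    dim_head = 16,
    num_degrees = 2,   # how many feature degrees (type-0, type-1, ...) to carry
)

feats = torch.randn(1, 32, 64)      # (batch, nodes, feature dim)
coors = torch.randn(1, 32, 3)       # 3D coordinates of each node
mask  = torch.ones(1, 32).bool()    # which nodes are real (not padding)

out = model(feats, coors, mask)     # equivariantly updated node features, (1, 32, 64)
```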
An Open-source Strong Baseline for SE(3) Planning in Autonomous Drone Racing
Implementation of Lie Transformer, Equivariant Self-Attention, in PyTorch
Robot Motion Estimate: Tools, Variables, and Factors for SLAM in robotics; also see Caesar.jl.
SE(3) interpolation and decoupled Quat + R^3 interpolation are both implemented.
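A minimal sketch of the two interpolation modes referred to above, using only NumPy and SciPy (the repository's own API is not assumed). `interp_se3` follows the SE(3) geodesic via the matrix log/exp, which couples rotation and translation, while `interp_quat_r3` slerps the rotation and linearly interpolates the translation separately; the two agree at the endpoints but generally trace different paths in between.

```python
import numpy as np
from scipy.linalg import expm, logm
from scipy.spatial.transform import Rotation, Slerp

def interp_se3(T0, T1, t):
    """SE(3) geodesic: follow the screw motion exp(t * log(T0^-1 @ T1)) from T0."""
    return T0 @ expm(t * logm(np.linalg.inv(T0) @ T1)).real

def interp_quat_r3(T0, T1, t):
    """Decoupled Quat + R^3: slerp the rotations, linearly interpolate the translations."""
    key_rots = Rotation.from_matrix(np.stack([T0[:3, :3], T1[:3, :3]]))
    R_t = Slerp([0.0, 1.0], key_rots)(t).as_matrix()
    p_t = (1.0 - t) * T0[:3, 3] + t * T1[:3, 3]
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R_t, p_t
    return T

# Example: rotate 90 degrees about z while translating 1 unit along x.
T0 = np.eye(4)
T1 = np.eye(4)
T1[:3, :3] = Rotation.from_euler("z", 90, degrees=True).as_matrix()
T1[:3, 3] = [1.0, 0.0, 0.0]
print(interp_se3(T0, T1, 0.5))      # midpoint along the SE(3) geodesic
print(interp_quat_r3(T0, T1, 0.5))  # same rotation midpoint, straight-line translation
```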
Lie groups and Lie algebras, with some quaternion utilities
Numerically stable implementation of batched SE(3) exponential and logarithmic maps
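As a rough illustration of what such a map has to handle, here is a sketch of a batched SE(3) exponential in PyTorch (not the repository's code). The twist layout `(omega, v)` and the tolerance `eps` are assumptions; the key point is the Taylor-series fallback for the Rodrigues and left-Jacobian coefficients as the rotation angle approaches zero.

```python
import torch

def se3_exp(xi: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Batched SE(3) exponential map.

    xi: (B, 6) twists laid out as (omega, v), omega = rotation part, v = translation part.
    Returns (B, 4, 4) homogeneous transforms. Small angles fall back to Taylor
    expansions so the map stays numerically stable as |omega| -> 0.
    """
    omega, v = xi[..., :3], xi[..., 3:]
    theta = omega.norm(dim=-1, keepdim=True)          # (B, 1)
    theta2, theta3 = theta ** 2, theta ** 3

    # Coefficients of Rodrigues' formula and the left Jacobian V:
    #   R = I + A * [omega]_x + B * [omega]_x^2
    #   V = I + B * [omega]_x + C * [omega]_x^2
    small = theta < eps
    A = torch.where(small, 1.0 - theta2 / 6.0, torch.sin(theta) / theta.clamp_min(eps))
    B = torch.where(small, 0.5 - theta2 / 24.0, (1.0 - torch.cos(theta)) / theta2.clamp_min(eps))
    C = torch.where(small, 1.0 / 6.0 - theta2 / 120.0, (theta - torch.sin(theta)) / theta3.clamp_min(eps))

    # Skew-symmetric matrices [omega]_x, batched.
    O = torch.zeros_like(theta.squeeze(-1))
    wx, wy, wz = omega.unbind(-1)
    K = torch.stack([
        torch.stack([O, -wz, wy], dim=-1),
        torch.stack([wz, O, -wx], dim=-1),
        torch.stack([-wy, wx, O], dim=-1),
    ], dim=-2)                                         # (B, 3, 3)
    K2 = K @ K

    I = torch.eye(3, dtype=xi.dtype, device=xi.device).expand_as(K)
    R = I + A[..., None] * K + B[..., None] * K2
    V = I + B[..., None] * K + C[..., None] * K2

    T = torch.zeros(*xi.shape[:-1], 4, 4, dtype=xi.dtype, device=xi.device)
    T[..., :3, :3] = R
    T[..., :3, 3] = (V @ v.unsqueeze(-1)).squeeze(-1)
    T[..., 3, 3] = 1.0
    return T

# Example: a batch of twists, including one that is exactly zero (the stability-critical case).
xi = torch.tensor([[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
                   [0.0, 0.0, 1.5, 1.0, 0.0, 0.0]])
print(se3_exp(xi))
```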