Performer's Fast Attention (FAVOR+) Module.

See "Rethinking Attention with Performers" (ICLR 2021, Oral) for the paper associated with this library, as well as the corresponding Google AI Blog post.

We currently have FAVOR+ (Softmax and Generalized variants) written in JAX and TensorFlow.
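
For intuition, below is a minimal, self-contained JAX sketch of the FAVOR+ softmax estimator: queries and keys are mapped through positive random features, and attention is computed in linear time in the sequence length by associating the matrix products as phi(Q) (phi(K)^T V). This is an illustrative sketch only, not the API of this library; the function names, the plain (non-orthogonal) Gaussian projection, and the numerical-stabilization details are assumptions for the example.

```python
import jax
import jax.numpy as jnp


def softmax_kernel_features(x, projection, is_query, eps=1e-6):
  """Positive random features phi(x) approximating the softmax kernel.

  x: [..., L, d] queries or keys, already rescaled by d ** -0.25.
  projection: [m, d] random Gaussian directions (orthogonal in the paper).
  """
  ratio = projection.shape[0] ** -0.5
  wx = jnp.einsum('...ld,md->...lm', x, projection)
  sq_norm = jnp.sum(x ** 2, axis=-1, keepdims=True) / 2.0
  if is_query:
    # Per-query stabilizer; it cancels in the numerator/denominator ratio.
    stab = jnp.max(wx, axis=-1, keepdims=True)
  else:
    stab = jnp.max(wx, axis=(-2, -1), keepdims=True)
  return ratio * (jnp.exp(wx - sq_norm - stab) + eps)


def favor_softmax_attention(q, k, v, projection):
  """Non-causal softmax attention approximated in O(L) time and memory."""
  d = q.shape[-1]
  q = q * (d ** -0.25)
  k = k * (d ** -0.25)
  q_prime = softmax_kernel_features(q, projection, is_query=True)   # [..., L, m]
  k_prime = softmax_kernel_features(k, projection, is_query=False)  # [..., L, m]
  kv = jnp.einsum('...lm,...ld->...md', k_prime, v)                 # [..., m, d_v]
  numerator = jnp.einsum('...lm,...md->...ld', q_prime, kv)
  denominator = jnp.einsum('...lm,...m->...l', q_prime, k_prime.sum(axis=-2))
  return numerator / denominator[..., None]


# Example usage with hypothetical shapes.
L, d, m = 1024, 64, 256
rng_q, rng_k, rng_v, rng_p = jax.random.split(jax.random.PRNGKey(0), 4)
q = jax.random.normal(rng_q, (L, d))
k = jax.random.normal(rng_k, (L, d))
v = jax.random.normal(rng_v, (L, d))
projection = jax.random.normal(rng_p, (m, d))
print(favor_softmax_attention(q, k, v, projection).shape)  # (1024, 64)
```

The sketch uses i.i.d. Gaussian features for brevity; the paper's FAVOR+ mechanism draws orthogonal random features, which reduce the variance of the softmax-kernel estimate.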

If you found this codebase useful, please consider citing the paper:

@inproceedings{performer,
  author    = {Krzysztof Choromanski and
               Valerii Likhosherstov and
               David Dohan and
               Xingyou Song and
               Andreea Gane and
               Tam{\'{a}}s Sarl{\'{o}}s and
               Peter Hawkins and
               Jared Davis and
               Afroz Mohiuddin and
               Lukasz Kaiser and
               David Belanger and
               Lucy Colwell and
               Adrian Weller},
  title     = {Rethinking Attention with Performers},
  booktitle = {International Conference on Learning Representations, {ICLR} 2021},
  year      = {2021},
}