
Fully Attention-Based Information Retriever (FABIR)

FABIR is a reading comprehension model introduced in this IJCNN 2018 paper. It was designed for the SQuAD dataset with the goal of low training and inference times, running at least five times faster than competing models. FABIR was inspired by Google's Transformer and includes no recurrence: it is a simple feedforward network made powerful by attention mechanisms.
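To give a feel for the core idea (attention in place of recurrence), here is a minimal NumPy sketch of scaled dot-product self-attention, the building block popularized by the Transformer. This is a generic illustration, not FABIR's actual code; the function names, weight matrices, and dimensions below are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model). Project into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Token-to-token compatibility scores, scaled by sqrt(d_k).
    scores = (Q @ K.T) / np.sqrt(K.shape[-1])
    # Each output position is a weighted mix of all value vectors,
    # computed in one shot with no recurrent state between positions.
    return softmax(scores) @ V

# Toy usage: 5 tokens with 8-dimensional embeddings attend to each other.
rng = np.random.default_rng(0)
d_model = 8
X = rng.normal(size=(5, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because every position attends to every other position in a single matrix product, the whole sequence is processed in parallel, which is what makes a purely feedforward, attention-based model like FABIR fast to train and run.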

Requirements

Citation

If you find FABIR useful, please cite us in your work:

@inproceedings{Correia2018,
  author = {Correia, Alvaro H. C. and Silva, Jorge L. M. and Martins, Thiago de C. and Cozman, Fabio G.},
  booktitle = {Proceedings of the International Joint Conference on Neural Networks},
  pages = {2799--2806},
  publisher = {IEEE},
  title = {A Fully Attention-Based Information Retriever},
  year = {2018}
}
