
Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct.


Transformer-XH

Source code for the paper "Transformer-XH: Multi-evidence Reasoning with Extra Hop Attention" (ICLR 2020).

Dependency Installation

First, run python setup.py develop to install the required dependencies for Transformer-XH. Then install apex (needed for distributed training) by following the official documentation here.
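The setup steps above can be sketched as follows. The apex clone URL and install flags below follow NVIDIA's apex README and are an assumption; consult the official apex documentation linked above for the current instructions.

```shell
# Install Transformer-XH's dependencies in development mode.
python setup.py develop

# Install apex for fp16 / distributed training (assumed steps, per
# NVIDIA's apex README; flags may differ for newer apex versions).
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --no-cache-dir ./
```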

Data and Trained Model Download

You can run the bash script download.sh.

For HotpotQA, we provide the processed graph input (Transformer-XH format) here. After downloading, unzip it and put it into the ./data folder. We also provide a trained model here; unzip the downloaded model and put it into the ./experiments folder.

Similarly, we provide the processed graph for FEVER here, and the trained model here.
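After unzipping the downloads, the repository should contain the two folders named in the instructions above (the contents of each folder are not enumerated here):

```
Transformer-XH/
├── data/           # unzipped processed graph inputs (HotpotQA and FEVER)
└── experiments/    # unzipped trained models
```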

Run Your Models

Use hotpot_train.sh for training on the HotpotQA task and hotpot_eval.sh for evaluation (fp16 training by default).

Similarly, use fever_train.sh for training on the FEVER task and fever_eval.sh for evaluation (fp16 training by default).
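Assuming the data and trained models are in place under ./data and ./experiments, a typical sequence would be (script names from above; any arguments are set inside the scripts themselves):

```shell
# Fine-tune and evaluate on HotpotQA (fp16 by default).
bash hotpot_train.sh
bash hotpot_eval.sh

# Fine-tune and evaluate on FEVER (fp16 by default).
bash fever_train.sh
bash fever_eval.sh
```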

Contact

If you have questions, suggestions, or bug reports, please email chenz@cs.umd.edu and/or Chenyan.Xiong@microsoft.com.

License

This project is licensed under the MIT License (see LICENSE and LICENSE.txt).