Code for training a Neural Open IE model (NAACL2018)


Maintenance of this project has moved to the AllenNLP framework. The models page there provides training and prediction instructions, as well as an online demo.



Code for training a supervised Neural Open IE model, as described in our NAACL2018 paper.
🚧 Still under construction 🚧

Citing πŸ”–

If you use this software, please cite:

```bibtex
@inproceedings{stanovsky2018supervised,
  author    = {Gabriel Stanovsky and Julian Michael and Luke Zettlemoyer and Ido Dagan},
  title     = {Supervised Open Information Extraction},
  booktitle = {Proceedings of The 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT)},
  month     = {June},
  year      = {2018},
  address   = {New Orleans, Louisiana},
  publisher = {Association for Computational Linguistics},
  pages     = {(to appear)},
}
```

Quickstart 🐣

1. Install requirements 🙇

```bash
pip install -r requirements.txt
```
2. Download embeddings 🚶

```bash
cd ./pretrained_word_embeddings/
```
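The directory above is expected to hold pretrained word vectors. As a sketch only — the GloVe-style text format (one word followed by its vector per line) and the helper name are assumptions for illustration, not part of this repo — loading such a file into a vocabulary and matrix might look like:

```python
import numpy as np

def load_embeddings(lines, dim):
    """Parse GloVe-style lines ("word v1 v2 ...") into a vocab dict and matrix."""
    vocab, vectors = {}, []
    for line in lines:
        parts = line.rstrip().split(" ")
        word, values = parts[0], parts[1:]
        if len(values) != dim:  # skip malformed lines
            continue
        vocab[word] = len(vectors)
        vectors.append([float(v) for v in values])
    return vocab, np.array(vectors, dtype=np.float32)

# Tiny inline example instead of a real embeddings file
sample = ["the 0.1 0.2 0.3", "cat 0.4 0.5 0.6"]
vocab, matrix = load_embeddings(sample, dim=3)
print(matrix.shape)  # (2, 3)
```

With a real file you would pass `open(path)` as `lines` and the file's vector dimension as `dim`.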
3. Train model 🏃

```bash
cd ./src
python ./rnn/ --train=../data/train.conll --dev=../data/dev.conll --test=../data/test.conll --load_hyperparams=../hyperparams/confidence.json
```

NOTE: Models are saved by default to the `models` dir, unless a `--saveto` command-line argument is passed.
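The `.conll` training files above follow a CoNLL-style layout. The exact column set is not documented in this README, so the reader below assumes only the generic shape — tab-separated token rows with a blank line between sentences — and the word/label columns in the sample are made up for illustration:

```python
def read_conll(text):
    """Group tab-separated token lines into sentences (blank-line separated)."""
    sentences, current = [], []
    for line in text.splitlines():
        if not line.strip():
            if current:
                sentences.append(current)
                current = []
        else:
            current.append(line.split("\t"))
    if current:
        sentences.append(current)
    return sentences

# Hypothetical two-sentence file with word and BIO-label columns
sample = "John\tB-A0\nloves\tB-P\nMary\tB-A1\n\nIt\tB-A0\nrains\tB-P\n"
print(len(read_conll(sample)))  # 2
```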

4. Predict with a trained model 👏

```bash
python ./ \
    --model=path/to/model
```
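As described in the paper, the model tags each word of a sentence with BIO labels marking the predicate and its arguments. A hypothetical decoder (not from this repo) that collapses such a tag sequence into labeled extraction spans could be sketched as:

```python
def bio_to_spans(tags):
    """Collapse a BIO tag sequence into (label, start, end) spans, end exclusive."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if label is not None:          # close any open span
                spans.append((label, start, i))
            label, start = tag[2:], i
        elif tag.startswith("I-") and label == tag[2:]:
            continue                       # span continues
        else:                              # "O" or inconsistent I- tag
            if label is not None:
                spans.append((label, start, i))
            label, start = None, None
    if label is not None:                  # close span running to the end
        spans.append((label, start, len(tags)))
    return spans

tags = ["B-A0", "I-A0", "B-P", "B-A1", "O"]
print(bio_to_spans(tags))  # [('A0', 0, 2), ('P', 2, 3), ('A1', 3, 4)]
```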

More scripts 🚴

See `src/scripts` for more handy scripts. Additional documentation coming soon!