Event Extraction by Answering (Almost) Natural Questions

Question answering for event extraction (trigger detection and argument extraction with various questioning strategies).

Paper: https://arxiv.org/abs/2004.13625

If you use my code, please cite:

@article{du2020eeqa,
  title={Event Extraction by Answering (Almost) Natural Questions},
  author={Du, Xinya and Cardie, Claire},
  journal={arXiv preprint arXiv:2004.13625},
  year={2020}
}

Feel free to ask questions: xdu [at] cs [dot] cornell [dot] edu. http://www.cs.cornell.edu/~xdu/

Preprocessing (for ACE data)

Read ./proc/README.md for details

Requirements

See requirements.txt

Code

Train and eval models

  • Trigger Detection

    QA-based model ([CLS] verb [SEP] input sentence [SEP]; see the input-formatting sketch after this list): bash ./code/script_trigger_qa.sh

  • Argument Extraction

    • With dynamic threshold: bash ./code/script_args_qa_thresh.sh

    • Without dynamic threshold: bash ./code/script_args_qa.sh

    • Get results on unseen arguments (train on a set that excludes the unseen arguments, then test on them): bash ./code/script_args_qa_unseen.sh
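
For reference, here is a minimal sketch of how a trigger-detection query of the form [CLS] verb [SEP] input sentence [SEP] can be built with a Hugging Face BERT tokenizer. This is not the code run by script_trigger_qa.sh; the model name, max length, and function name are assumptions for illustration.

```python
# Illustrative sketch only -- not the repository's training code.
# Assumes the Hugging Face `transformers` package and bert-base-uncased.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def build_trigger_input(sentence_tokens, query_word="verb", max_length=128):
    """Encode the fixed query word and the sentence as a pair,
    producing the sequence [CLS] query [SEP] sentence [SEP]."""
    return tokenizer(
        query_word,
        " ".join(sentence_tokens),
        truncation=True,
        max_length=max_length,
        return_tensors="pt",
    )

encoded = build_trigger_input(["The", "attack", "killed", "three", "people", "."])
print(tokenizer.decode(encoded["input_ids"][0]))
# e.g. "[CLS] verb [SEP] the attack killed three people. [SEP]"
```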

Question Templates

  • Template 1 (Role Name)

  • Template 2 (Role + Type): ./question_templates/arg_queries.csv

  • Template 3 (Annotation Guideline): ./question_templates/description_queries.csv
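
As a rough illustration of how the template CSVs might be consumed, the sketch below maps each argument role to its question string. This is an assumption about the file layout, not the repository's loader; check ./question_templates/arg_queries.csv for the actual columns.

```python
# Sketch, assuming each CSV row pairs an argument role with its question text.
import csv

def load_question_templates(path):
    """Map each argument role to its question template string."""
    templates = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if len(row) >= 2:
                role, question = row[0], row[1]
                templates[role] = question
    return templates

templates = load_question_templates("./question_templates/arg_queries.csv")
# e.g. templates.get("Attacker") might return a question such as
# "who is the attacker?" (the example value is illustrative only).
```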


For the unseen-arguments analysis, see unseen_args and all_args.
