vernadankers/neural_metaphor_discourse

neural_metaphor_discourse

A codebase for metaphor detection using neural models that incorporate discourse-level information.
Two sequence labelling models are implemented:

  1. Concatenated ELMo and GloVe embeddings fed to a one-layer LSTM model, followed by a linear classification layer and softmax;
  2. BERT-base-cased followed by a linear classification layer and softmax.
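The repository does not show the model code here, but the architecture of model (1) can be sketched in PyTorch. The class and dimension names below are hypothetical, assuming pre-computed ELMo+GloVe token embeddings and a discourse vector supplied per sentence (the dimensions are illustrative defaults, not the repository's actual settings):

```python
import torch
import torch.nn as nn

class LSTMMetaphorTagger(nn.Module):
    """Hypothetical sketch of model (1): embeddings -> one-layer LSTM
    -> linear classification layer + softmax, with a discourse vector
    concatenated to the classifier input."""

    def __init__(self, emb_dim=1324, hidden_dim=256, disc_dim=256, num_labels=2):
        super().__init__()
        # One-layer (bi)LSTM over the concatenated ELMo+GloVe embeddings.
        self.lstm = nn.LSTM(emb_dim, hidden_dim, num_layers=1,
                            batch_first=True, bidirectional=True)
        # Classifier sees LSTM states plus the discourse vector.
        self.classifier = nn.Linear(2 * hidden_dim + disc_dim, num_labels)

    def forward(self, embeddings, discourse_vec):
        # embeddings:    (batch, seq_len, emb_dim) pre-computed token vectors
        # discourse_vec: (batch, disc_dim), broadcast across all tokens
        out, _ = self.lstm(embeddings)
        disc = discourse_vec.unsqueeze(1).expand(-1, out.size(1), -1)
        logits = self.classifier(torch.cat([out, disc], dim=-1))
        return torch.log_softmax(logits, dim=-1)
```

Model (2) follows the same pattern with BERT-base-cased hidden states in place of the LSTM output.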

Discourse-level information is incorporated through a discourse vector concatenated to the input of the linear classification layer. Discourse is defined as a window of 2k+1 sentences centred on the target sentence; k=0 includes only the immediate sentential context.
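The 2k+1 windowing above can be sketched as a small helper. This is an illustrative function (not from the repository) that clips the window at document boundaries:

```python
def discourse_window(sentences, i, k):
    """Return the window of 2k+1 sentences centred on sentence i,
    clipped at the document boundaries. With k=0 only sentence i
    itself (the immediate sentential context) is returned."""
    lo = max(0, i - k)
    hi = min(len(sentences), i + k + 1)
    return sentences[lo:hi]
```

For example, with k=1 the window covers the preceding sentence, the target sentence, and the following sentence, shrinking near the start or end of a document.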

Installation

Usage

python main.py --model "elmo" | "bert" --attention "general" | "hierarchical"
               --seed <int> --k <int> --lr <float> --epochs <int> --batch_size <int>
               --meta_train <filename> --meta_dev <filename> --meta_test <filename> --output <filename>

Credits

Credits to my co-authors:
@kmalhotra30
@gkudva96
@vovamedentsiy
@eshutova

License

MIT
