ESA: Entity Summarization with Attention

EYRE@CIKM'19 paper: "Entity Summarization with Attention"

ENVIRONMENT AND DEPENDENCIES

Environment

  • Ubuntu 16.04
  • Python 3.5+
  • PyTorch 1.0.1
  • Java 8

Dependencies

pip install numpy
pip install tqdm
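To confirm the dependencies installed correctly, a quick sanity check (an optional helper, not part of the ESA codebase) is:

```python
# Verify that the listed dependencies import and report their versions.
import numpy
import tqdm

print("numpy", numpy.__version__)
print("tqdm", tqdm.__version__)
```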

USAGE

Train

git clone git@github.com:WeiDongjunGabriel/ESA.git
cd ESA/model
python main.py

We also provide a command-line tool for training the ESA model; run the following command for more details:

python main.py -h
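The flags used in the examples below suggest an argparse interface roughly like the following sketch. This is a hypothetical reconstruction; the actual option names come from the commands in this README, but the defaults, help strings, and any extra options in main.py may differ.

```python
# Hypothetical sketch of the argparse setup implied by the README's flags;
# the real main.py may define these options differently.
import argparse


def build_parser():
    parser = argparse.ArgumentParser(description="Train/test the ESA model")
    parser.add_argument("--db_name", default="dbpedia",
                        help="dataset to use")
    parser.add_argument("--mode", choices=["train", "test", "all"],
                        default="train", help="train, test, or both")
    parser.add_argument("--transE_dim", type=int, default=100)
    parser.add_argument("--pred_embedding_dim", type=int, default=100)
    parser.add_argument("--lr", type=float, default=0.0001)
    parser.add_argument("--clip", type=float, default=50)
    parser.add_argument("--loss_function", default="BCE")
    parser.add_argument("--regularization", default="False")
    parser.add_argument("--n_epoch", type=int, default=50)
    parser.add_argument("--save_every", type=int, default=2)
    parser.add_argument("--use_epoch", type=int, default=48,
                        help="checkpoint epoch to load when testing")
    return parser


args = build_parser().parse_args(["--db_name", "dbpedia", "--mode", "train"])
print(args.db_name, args.mode)  # dbpedia train
```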

For example, to train the model on DBpedia, run:

python main.py \
    --db_name dbpedia \
    --mode train \
    --transE_dim 100 \
    --pred_embedding_dim 100 \
    --lr 0.0001 \
    --clip 50 \
    --loss_function BCE \
    --regularization False \
    --n_epoch 50 \
    --save_every 2

To test the model and generate entity summarization results, run:

python main.py \
    --db_name dbpedia \
    --mode test \
    --use_epoch 48

We also provide a mode called "all" that trains and tests the model in a single run:

python main.py \
    --db_name dbpedia \
    --mode all \
    --transE_dim 100 \
    --pred_embedding_dim 100 \
    --lr 0.0001 \
    --clip 50 \
    --loss_function BCE \
    --regularization False \
    --n_epoch 50 \
    --save_every 2 \
    --use_epoch 48
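The three modes relate as follows: "all" simply performs the train step and then the test step in one invocation. A minimal sketch of that dispatch, with hypothetical step descriptions standing in for the real training and testing routines:

```python
# Hypothetical dispatch over the three --mode values; the step strings
# stand in for the actual train/test routines in main.py.
def run(mode, n_epoch=50, use_epoch=48):
    """Return the steps executed for a given --mode value."""
    steps = []
    if mode in ("train", "all"):
        steps.append(f"train for {n_epoch} epochs")
    if mode in ("test", "all"):
        steps.append(f"test with checkpoint from epoch {use_epoch}")
    return steps


print(run("all"))
# ['train for 50 epochs', 'test with checkpoint from epoch 48']
```

Note that --use_epoch should point at a saved checkpoint, so with --save_every 2 it needs to be a multiple of 2 (as in the example, epoch 48).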

Test

cd .../ESA
cd test
sh run.sh

CITATION

If you use our model or code, please cite it as follows:

@inproceedings{ESA,
  author    = {Dongjun Wei and
               Yaxin Liu and
               Fuqing Zhu and
               Liangjun Zang and
               Wei Zhou and
               Jizhong Han and 
               Songlin Hu},
  title     = {ESA: Entity Summarization with Attention},
  booktitle = {EYRE@CIKM},
  year      = {2019}
}
