ReLink in a nutshell

ReLink consists of two consecutive modules: the hybrid entity annotator and the ReCon reranker.

For consistency of evaluation, we reuse and adapt NewsReader's evaluation scripts, found at:

Run ReLink

  1. Run the hybrid annotator (ReLink step 1 of 2) to annotate entities in the .txt files. Please contact @giusepperizzo for the latest version of this annotator.
  2. Run scripts/ to produce annotations in CoNLL extended format (.conlle).
  3. Run scripts/ to convert all annotated files to the .naf format, which is used by ReCon and by our scoring scripts.
  4. Run scripts/ to rerank the results with ReCon, the second step of ReLink.

Note: Throughout these steps, make sure you supply the correct parameters to the scripts. If you run into any problems, feel free to contact us.

Dependencies for the ReCon module:

Summary of the format workflow

txt (plain text), token (tokenized text) -> adel -> conlle (annotations in CoNLL extended format)

conlle -> -> naf

naf -> recon -> out (reranked links)
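The final reranking step can be pictured abstractly as re-scoring each candidate link against the document context. The sketch below is not the actual ReCon algorithm; it is a minimal illustration of context-based reranking, and all names in it (`Candidate`, `rerank`, the example URIs and weights) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    uri: str           # candidate knowledge-base link
    base_score: float  # confidence from the initial annotator
    context: set       # words associated with this entity

def rerank(candidates, doc_context, alpha=0.5):
    """Combine the annotator's score with contextual word overlap."""
    def score(c):
        overlap = len(c.context & doc_context) / (len(doc_context) or 1)
        return alpha * c.base_score + (1 - alpha) * overlap
    return sorted(candidates, key=score, reverse=True)

# Toy example: document context favors the geographic reading of "bank".
doc = {"bank", "river", "water"}
cands = [
    Candidate("dbpedia:Bank_(finance)", 0.9, {"money", "loan"}),
    Candidate("dbpedia:Bank_(geography)", 0.6, {"river", "water"}),
]
best = rerank(cands, doc)[0]
```

Here the context overlap outweighs the annotator's higher base score for the finance reading, so the geographic link is ranked first.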

LREC 2016 experiments


For the convenience of potential replicators, we provide .naf versions of the gold standard for these datasets (AIDA-YAGO2 and MEANTIME) in the GOLD/ folder.

If desired, replicators are also welcome to download the TSV version of AIDA-YAGO2 and convert it to NAF themselves using scripts/
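Reading such a token-per-line TSV file is straightforward. The sketch below assumes an AIDA-YAGO2-style layout (token in the first column, the linked entity in a later tab-separated column, documents separated by `-DOCSTART-` lines); the exact column order and the sample content are assumptions for illustration, not the dataset's specification.

```python
# Hypothetical excerpt in AIDA-YAGO2-style TSV: one token per line,
# tab-separated annotation columns for linked mentions.
sample = (
    "-DOCSTART- (1 EU)\n"
    "EU\tB\tEU\tEuropean_Union\n"
    "rejects\n"
    "German\tB\tGerman\tGermany\n"
    "call\n"
)

def read_aida_tsv(text):
    """Yield (token, link-or-None) pairs, skipping document separators."""
    for line in text.splitlines():
        if not line or line.startswith("-DOCSTART-"):
            continue
        cols = line.split("\t")
        token = cols[0]
        link = cols[3] if len(cols) > 3 else None
        yield token, link

pairs = list(read_aida_tsv(sample))
```

A converter to NAF would then wrap these (token, link) pairs in the corresponding NAF layers.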

The scripts on GitHub that implement the four-step sequence above contain the default settings for evaluating ReLink on the AIDA-YAGO2 and MEANTIME datasets.


The adapted version of NewsReader's scorers can be found in the ned-evaluation/ folder. See the script for the set of commands we used to evaluate our solutions for our LREC 2016 paper. To aid potential replicators, we briefly describe how these scorers operate. They consist of three Perl scripts:

  • Analyze the system annotations (ned-evaluation/ or ned-evaluation/) and transform the .naf files in a folder into a single output file.
  • Analyze the gold standard data (ned-evaluation/ or ned-evaluation/) and transform the .naf files in a folder into a single output file.
  • Compare the outputs of the previous two steps using the script ned-evaluation/ For correct evaluation, ensure the filenames produced by the previous two steps coincide.
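Since the comparison step silently depends on matching filenames, it can be worth verifying the two folders agree before scoring. The helper below is not part of the repository; it is a small sketch, and the function name and directory arguments are hypothetical.

```python
from pathlib import Path

def check_filenames(system_dir, gold_dir, ext=".naf"):
    """Return the .naf filenames that appear in only one of the two folders."""
    sys_names = {p.name for p in Path(system_dir).glob("*" + ext)}
    gold_names = {p.name for p in Path(gold_dir).glob("*" + ext)}
    return sorted(sys_names - gold_names), sorted(gold_names - sys_names)
```

If both returned lists are empty, the system and gold folders cover the same set of files and the comparison script can be run safely.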


Contributors

  • Giuseppe Rizzo
  • Filip Ilievski
  • Marieke van Erp
  • Julien Plu
  • Raphael Troncy


Context-enhanced Adaptive Entity Linking


