This repository has been archived by the owner on Mar 30, 2022. It is now read-only.

bioRxiv 10k Evaluation

Daniel Ecer edited this page Jun 15, 2021 · 8 revisions


bioRxiv 10k Dataset

We prepared and shared a CC-BY 4.0 subset of the bioRxiv data for training and evaluation purposes. For evaluation we will be using the test dataset.

The XML, in particular around references, contains inconsistencies and errors. Some of those errors have been fixed automatically (based only on the XML file itself).

Some of the fixes to the XML (see fix_jats_xml.py):

  • validate DOI, PMID, PMCID, PII (remove or fix annotation where not valid)
  • add missing DOI, PMID, PMCID, PII, external links (existing annotation will always be preferred, if valid)
  • remove PMCID from reference article titles
  • remove surrounding quotes
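The identifier clean-up above can be illustrated with a small sketch. This is not the actual `fix_jats_xml.py` code; the function name, regex, and normalization steps here are illustrative assumptions about how DOI validation of this kind typically works.

```python
import re

# Illustrative DOI pattern: "10." followed by a 4-9 digit registrant code,
# a slash, and a non-empty suffix. Real-world validation may be stricter.
DOI_PATTERN = re.compile(r'^10\.\d{4,9}/\S+$')


def clean_doi(raw: str):
    """Return a normalized DOI string, or None if the value is not valid.

    Hypothetical helper, similar in spirit to the fixes listed above:
    it strips surrounding quotes and a "doi:" / URL prefix, then validates.
    """
    # remove surrounding whitespace and straight or curly quotes
    candidate = raw.strip().strip('"\u201c\u201d').strip()
    # drop a common "doi:" or doi.org URL prefix
    candidate = re.sub(r'^(https?://(dx\.)?doi\.org/|doi:)\s*', '',
                       candidate, flags=re.IGNORECASE)
    return candidate if DOI_PATTERN.match(candidate) else None
```

Invalid values return `None`, matching the "remove or fix annotation where not valid" behaviour described above.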

The revised XML files are available here:

ScienceBeam Judge

We are using ScienceBeam Judge as the evaluation tool for XML conversion. The evaluation method is based on GROBID’s End-to-End evaluation, which it extends by handling lists and sets differently. ScienceBeam Judge comes with JATS and TEI XML field mappings, which allow it to evaluate the GROBID TEI output directly against the JATS XML.

Evaluation Results

The fields are evaluated as follows (see the ScienceBeam Judge evaluation documentation):

  • Single text value: title, abstract, first author name, first reference text
  • Partial set: authors, affiliations
  • Partial ulist: references
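The difference between a plain value and a "set" evaluation is that set matching is order-independent: each expected item is paired with its best-scoring counterpart. The greedy pairing below is a minimal sketch of that idea, not ScienceBeam Judge's actual algorithm (which also distinguishes ordered lists and partial matching in more detail).

```python
def match_set(expected, actual, score_fn):
    """Order-independent matching sketch: greedily pair each expected item
    with its best-scoring remaining actual item, then average the scores.
    """
    remaining = list(actual)
    scores = []
    for item in expected:
        if not remaining:
            scores.append(0.0)  # nothing left to pair with
            continue
        best = max(remaining, key=lambda other: score_fn(item, other))
        scores.append(score_fn(item, best))
        remaining.remove(best)  # each actual item is used at most once
    return sum(scores) / max(len(expected), 1)


# Example with exact-match scoring: author order does not affect the result.
exact = lambda a, b: 1.0 if a == b else 0.0
```

With `exact` as the scorer, `match_set(['a', 'b'], ['b', 'a'], exact)` yields a full score despite the different order, while a missing item reduces the average.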

For each text, the edit (Levenshtein) distance is calculated. A text is considered matching if its score is at least 0.8.
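The scoring can be sketched as a normalized edit distance: 1.0 for identical strings, scaled down by the number of edits relative to the longer string. The exact normalization ScienceBeam Judge uses may differ; this is only an assumption to make the 0.8 threshold concrete.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]


def similarity(a: str, b: str) -> float:
    """Edit distance normalized to [0, 1]; 1.0 means identical strings."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))


MATCH_THRESHOLD = 0.8  # a text counts as matching at a score of >= 0.8


def is_match(expected: str, actual: str) -> bool:
    return similarity(expected, actual) >= MATCH_THRESHOLD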

The following models were tested:

Note: ScienceBeam here refers to a variant of GROBID that has been trained on bioRxiv.

[Chart: ScienceBeam Evaluation Summary, bioRxiv 10k test (title, abstract, affiliations, sections, figures and tables)]

[Chart: ScienceBeam Evaluation Summary, bioRxiv 10k test (references)]


Caption: Evaluation results on the bioRxiv 10k test dataset (1,998 documents; 2 documents failed to process for all of the models), using the revised XML v0.0.13 (see above).

ScienceBeam Demo

The above models can be tried out on our ScienceBeam Demo website.