This repository contains the models and supplementary data for the paper A Neural Network Multi-Task Learning Approach to Biomedical Named Entity Recognition by Gamal Crichton, Sampo Pyysalo, Billy Chiu and Anna Korhonen.

The supplementary data can be found in the file Additional file 1.pdf.

The corpora used for the experiments (which can be re-distributed) are in the data folder.
Note: The re-distribution status of the BioCreative IV Chemical and Drug (BC4CHEMD) named entity recognition task corpus is unclear, but the corpus is publicly accessible.

The models can be found in the models folder.

There are several files in the models folder:

  • The MLP model used as a baseline for the experiments.

    Example usage: python 'path/to/dataset' 'path/to/vectorfile'

  • The configurable variables and their values for the MLP baseline model.

  • The configurable variables and their values for the convolutional models.

  • The multi-task Dependent Model.

    Example usage: python 'path/to/data-files' 'dataset-1,...,dataset-n' 'path/to/vectorfile'

  • The multi-output multi-task model.

    Example usage: python 'path/to/data-files' 'dataset-1,...,dataset-n' 'path/to/vectorfile'

  • The model used in the multi-task experiments which investigated the effect of multi-task learning on datasets of various sizes.
    Use the --percent-keep option to specify the fraction of training examples to randomly keep from the dataset whose size you wish to vary. This must be the first dataset specified; all other datasets train on their full training data.

    Example usage: python --percent-keep 0.5 'path/to/data-files' 'path/to/reduced-dataset,path/to/whole-dataset' 'path/to/vectorfile'
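    The subsampling behaviour of --percent-keep can be sketched as follows. This is an illustration only, not the repository's actual code; the function name, seed handling, and rounding are assumptions:

    ```python
    import random

    def subsample_training(examples, percent_keep, seed=0):
        """Randomly keep a fraction of the training examples, as --percent-keep
        does for the first (size-varied) dataset; the rest are discarded."""
        rng = random.Random(seed)  # fixed seed so a run is reproducible
        n_keep = int(round(len(examples) * percent_keep))
        return rng.sample(examples, n_keep)

    # e.g. keep half of a 100-example training set
    reduced = subsample_training(list(range(100)), 0.5)
    ```

    Only the first dataset passed on the command line is reduced this way; the remaining datasets keep their full training sets.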

  • The single task model.

    Example usage: python 'path/to/dataset' 'path/to/vectorfile'

Note: The experiments in the paper applied the Viterbi algorithm to the outputs. Use the --viterbi flag to replicate this.
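For reference, Viterbi decoding over per-token tag scores can be sketched as below. This is a generic illustration of the algorithm, not the repository's implementation; the score and transition-matrix representations are assumptions:

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions:   per-token tag scores, shape [n_tokens][n_tags]
    transitions: tag-to-tag transition scores, shape [n_tags][n_tags]
    """
    n_tags = len(emissions[0])
    score = list(emissions[0])        # best path score ending in each tag
    backptrs = []                     # back-pointers for path recovery
    for obs in emissions[1:]:
        new_score, ptrs = [], []
        for j in range(n_tags):
            # best previous tag i to transition into current tag j
            i = max(range(n_tags), key=lambda i: score[i] + transitions[i][j])
            ptrs.append(i)
            new_score.append(score[i] + transitions[i][j] + obs[j])
        score = new_score
        backptrs.append(ptrs)
    # trace the best path backwards from the highest-scoring final tag
    best = [max(range(n_tags), key=score.__getitem__)]
    for ptrs in reversed(backptrs):
        best.append(ptrs[best[-1]])
    return best[::-1]
```

Unlike greedy per-token decoding, this globally maximises the path score, so a strong transition penalty (e.g. forbidding I-tags after O) can override a locally preferred tag.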


The code is provided under the MIT license and the other materials under the Creative Commons Attribution 4.0 license.

