DELTA - A DEep learning Language Technology plAtform

What is DELTA?

DELTA is a deep learning based end-to-end natural language and speech processing platform. DELTA aims to make it easy and fast to use, deploy, and develop natural language processing and speech models, for both academic and industrial use cases. DELTA is mainly implemented in TensorFlow and Python 3.

For details of DELTA, please refer to this paper.

What can DELTA do?

DELTA has been used to develop several state-of-the-art algorithms for publications and to deliver production services to millions of users. It helps you train, develop, and deploy NLP and/or speech models, featuring:

  • Easy-to-use
    • One command to train NLP and speech models, including:
      • NLP: text classification, named entity recognition, question answering, text summarization, etc.
      • Speech: speech recognition, speaker verification, emotion recognition, etc.
    • Configuration files to easily tune parameters and network structures
  • Easy-to-deploy
    • What you see in training is what you get in serving: all data processing and feature extraction are integrated into the model graph
    • Uniform I/O interfaces and no changes needed for new models
  • Easy-to-develop
    • Easily build state-of-the-art models from modularized components
    • All modules are reliable and fully tested
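
A configuration-driven run might look like the following YAML sketch. The field names here are purely illustrative, not DELTA's actual configuration schema; see the config files under egs/ for the real format.

```yaml
# Hypothetical config sketch -- illustrative field names only,
# not DELTA's actual schema.
data:
  train: data/train.txt
  eval: data/eval.txt
model:
  name: han-cls
  embedding_dim: 200
  hidden_dim: 128
solver:
  optimizer: adam
  learning_rate: 0.001
  epochs: 10
```

Changing hyperparameters or swapping network structures is then an edit to the config file rather than a code change.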


Quick Installation

We use conda to install the required packages. Please install conda if you do not already have it on your system.

We provide two installation options: an nlp version and a full version. The nlp version has minimal requirements and installs only the NLP-related packages. Note: users in mainland China may need to set up conda mirror sources; see ./tools/install/ for details.

# Run the installation script for NLP version, with CPU or GPU.
cd tools
./install/ nlp [cpu|gpu]

If you want to use both the NLP and speech packages, install the full version. The full version requires the Kaldi library, which can be pre-installed or installed by our installation script.

cd tools
# If you have installed Kaldi
KALDI=/your/path/to/Kaldi ./install/ full [cpu|gpu]
# If you have not installed Kaldi, use the following command
# ./install/ full [cpu|gpu]

To verify the installation, run:

# Activate conda environment
conda activate delta-py3.6-tf1.14
# Or use the following command if your conda version is < 4.6
# source activate delta-py3.6-tf1.14

# Add the DELTA environment

# Generate mock data for text classification.
pushd egs/mock_text_cls_data/text_cls/v1

# Train the model
python3 delta/ --cmd train_and_eval --config egs/mock_text_cls_data/text_cls/v1/config/han-cls.yml

Manual installation

For advanced installation, full version users, or more details, please refer to manual installation.

Docker install

For Docker users, we provide images with DELTA installed. Please refer to docker installation.

Quick Start

Existing Examples

DELTA organizes many commonly-used tasks as examples in the egs directory. Each example is an NLP or speech task using a public dataset. We provide the whole pipeline, including data processing, model training, evaluation, and deployment.

You can simply use the run script under each directory to prepare the dataset, and then train or evaluate a model. For example, the following commands download the CoNLL 2003 dataset and train and evaluate a BiLSTM-CRF model for NER:

pushd ./egs/conll2003/seq_label/v1/
python3 delta/ --cmd train --config egs/conll2003/seq_label/v1/config/seq-label.yml
python3 delta/ --cmd eval --config egs/conll2003/seq_label/v1/config/seq-label.yml


There are several modes to start a DELTA pipeline:

  • train_and_eval
  • train
  • eval
  • infer
  • export_model

Before running any command, make sure to source the environment setup script in the current command prompt or in your shell script.

You can use train_and_eval to start the model training and evaluation:

python3 delta/ --cmd train_and_eval --config <your configuration file>.yml

This is equivalent to:

python3 delta/ --cmd train --config <your configuration file>.yml 
python3 delta/ --cmd eval --config <your configuration file>.yml 
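
The equivalence above can be pictured with a minimal sketch of a --cmd dispatcher. This is hypothetical, not DELTA's actual entry point: train_and_eval is simply the train step followed by the eval step.

```python
# Minimal sketch of a --cmd dispatcher (hypothetical, not DELTA's
# actual entry point): train_and_eval is just train, then eval.
def train(config):
    return f"trained with {config}"

def evaluate(config):
    return f"evaluated with {config}"

def run(cmd, config):
    # Each mode maps to an ordered list of pipeline steps.
    dispatch = {
        "train": [train],
        "eval": [evaluate],
        "train_and_eval": [train, evaluate],  # equivalent to train, then eval
    }
    return [step(config) for step in dispatch[cmd]]

print(run("train_and_eval", "han-cls.yml"))
```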

For evaluation, you need to prepare a data file with both features and labels. If you only want to run inference on features (without labels), use the infer mode:

python3 delta/ --cmd infer --config <your configuration file>.yml 

When the training is done, you can export a model checkpoint to SavedModel:

python3 delta/ --cmd export_model --config <your configuration file>.yml 
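
The export_model mode writes a TensorFlow SavedModel. As a quick sanity check on the result, a standard SavedModel directory contains a saved_model.pb file plus a variables/ subdirectory; the following standard-library sketch verifies that layout (the directory names in the example are mock values):

```python
import tempfile
from pathlib import Path

def looks_like_saved_model(export_dir: str) -> bool:
    """Heuristic check for the standard SavedModel layout:
    a saved_model.pb protobuf plus a variables/ subdirectory."""
    root = Path(export_dir)
    return (root / "saved_model.pb").is_file() and (root / "variables").is_dir()

# Example: build a mock export directory and validate its layout.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp) / "exported"
    (root / "variables").mkdir(parents=True)
    (root / "saved_model.pb").write_bytes(b"")
    print(looks_like_saved_model(str(root)))  # True
```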


For model deployment, we provide many tools in the DELTA-NN package and organize the deployment scripts under the ./dpl directory.

  • Put the SavedModel and the model.yaml configuration into dpl/model.
  • Use the scripts under dpl/gadapter to convert the model into other deployment formats.
  • All compiled TensorFlow libs and delta-nn libs are in dpl/lib.
  • Test, benchmark, or serve under Docker.


Benchmarks

In DELTA, we provide experimental results for each task on public datasets as benchmarks. For each task, we compare our implementation with a similar model chosen from a highly-cited publication. You can reproduce the experimental results using the scripts and configurations in the ./egs directory. For more details, please refer to the released models.

NLP tasks

| Task | Model | DataSet | Metric | DELTA | Baseline | Baseline reference |
|---|---|---|---|---|---|---|
| Sentence Classification | CNN | TREC | Acc | 92.2 | 91.2 | Kim (2014) |
| Document Classification | HAN | Yahoo Answer | Acc | 75.1 | 75.8 | Yang et al. (2016) |
| Named Entity Recognition | BiLSTM-CRF | CoNLL 2003 | F1 | 84.6 | 84.7 | Huang et al. (2015) |
| Intent Detection (joint) | BiLSTM-CRF-Attention | ATIS | Acc | 97.4 | 98.2 | Liu and Lane (2016) |
| Slots Filling (joint) | BiLSTM-CRF-Attention | ATIS | F1 | 95.2 | 95.9 | Liu and Lane (2016) |
| Natural Language Inference | LSTM | SNLI | Acc | 80.7 | 80.6 | Bowman et al. (2016) |
| Summarization | Seq2seq-LSTM | CNN/Daily Mail | RougeL | 27.3 | 28.1 | See et al. (2017) |
| Pretrain-NER | ELMO | CoNLL 2003 | F1 | 92.2 | 92.2 | Peters et al. (2018) |
| Pretrain-NER | BERT | CoNLL 2003 | F1 | 94.6 | 94.9 | Devlin et al. (2019) |
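
As a reminder of how the F1 column is computed, here is a small sketch of precision, recall, and F1 from entity-level true-positive, false-positive, and false-negative counts. The counts in the example are illustrative only; published NER scores are produced by the CoNLL evaluation script.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 (in percent): harmonic mean of precision and recall."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 100 * 2 * precision * recall / (precision + recall)

# Illustrative counts: 846 correct entities, 154 spurious, 154 missed.
print(round(f1_score(846, 154, 154), 1))  # 84.6
```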

Speech tasks


| Task | Model | DataSet | Metric | DELTA | Baseline | Baseline reference |
|---|---|---|---|---|---|---|
| Speech recognition | CTC | | | | | |
| Speaker verification | TDNN | VoxCeleb | EER | 3.028 | 3.138 | Kaldi |
| Emotion recognition | ResNet | IEMOCAP | Acc | 59.15 | 56.10 | Neumann and Vu (2017) |
| Emotion recognition | RNN-mean pool | IEMOCAP | Acc | 65.23 | 56.90 | Mirsamadi et al. (2017) |
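
The EER (equal error rate) used for speaker verification is the operating point where the false-accept rate on impostor trials equals the false-reject rate on target trials. A small self-contained sketch using a simple threshold sweep (not Kaldi's implementation; the scores are toy values):

```python
def equal_error_rate(target_scores, impostor_scores):
    """Sweep thresholds over all observed scores and return the rate
    at the point where false-accept and false-reject rates are closest."""
    thresholds = sorted(set(target_scores) | set(impostor_scores))
    best_gap, eer = 1.0, 0.0
    for t in thresholds:
        far = sum(s >= t for s in impostor_scores) / len(impostor_scores)
        frr = sum(s < t for s in target_scores) / len(target_scores)
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Toy scores: higher means "more likely the same speaker".
targets = [0.9, 0.8, 0.7, 0.2]
impostors = [0.1, 0.3, 0.4, 0.8]
print(equal_error_rate(targets, impostors))  # 0.25
```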


FAQ

See the FAQ for more information.


Contributing

Any contribution is welcome. All issues and pull requests are highly appreciated! For more details, please refer to the contribution guide.


Citation

Please cite this paper when referencing DELTA:

@article{delta,
       author = {{Han}, Kun and {Chen}, Junwen and {Zhang}, Hui and {Xu}, Haiyang and
         {Peng}, Yiping and {Wang}, Yun and {Ding}, Ning and {Deng}, Hui and
         {Gao}, Yonghu and {Guo}, Tingwei and {Zhang}, Yi and {He}, Yahao and
         {Ma}, Baochang and {Zhou}, Yulong and {Zhang}, Kangli and {Liu}, Chao and
         {Lyu}, Ying and {Wang}, Chenxi and {Gong}, Cheng and {Wang}, Yunbo and
         {Zou}, Wei and {Song}, Hui and {Li}, Xiangang},
       title = "{DELTA: A DEep learning based Language Technology plAtform}",
       journal = {arXiv e-prints},
       year = "2019",
       url = {},
}


License

The DELTA platform is licensed under the terms of the Apache license. See LICENSE for more information.


Acknowledgement

The DELTA platform depends on many open source repos. See References for more information.
