BERT-DST

Contact: Guan-Lin Chao (guanlinchao@cmu.edu)

Source code for our paper BERT-DST: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer (INTERSPEECH 2019).

@inproceedings{chao2019bert,
title={{BERT-DST}: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer},
author={Chao, Guan-Lin and Lane, Ian},
booktitle={INTERSPEECH},
year={2019}
}

Tested with Python 3.6 and tensorflow==1.13.0rc0.

Required packages (no need to install; just provide their paths in the code, as sketched below):

  1. bert: the BERT source code repository.
  2. uncased_L-12_H-768_A-12: the pretrained BERT-Base, Uncased model checkpoint (download link available in the bert repository).
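
As a minimal sketch, assuming hypothetical path variables (the actual configuration lives in main.py and train.sh), pointing the code at a cloned bert repository and the unpacked checkpoint might look like this:

```python
import os
import sys

# Hypothetical paths: adjust to wherever you cloned bert and unpacked the checkpoint.
BERT_REPO_PATH = "/path/to/bert"
BERT_CKPT_DIR = "/path/to/uncased_L-12_H-768_A-12"

# Make the (uninstalled) bert repository importable.
sys.path.append(BERT_REPO_PATH)
import modeling       # modeling.py from the bert repo
import tokenization   # tokenization.py from the bert repo

# Load the BERT-Base, Uncased configuration, vocabulary, and checkpoint path.
bert_config = modeling.BertConfig.from_json_file(
    os.path.join(BERT_CKPT_DIR, "bert_config.json"))
tokenizer = tokenization.FullTokenizer(
    vocab_file=os.path.join(BERT_CKPT_DIR, "vocab.txt"), do_lower_case=True)
init_checkpoint = os.path.join(BERT_CKPT_DIR, "bert_model.ckpt")
```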

Datasets:

dstc2-clean, woz_2.0, sim-M, and sim-R
