BERT-DST

Contact: Guan-Lin Chao (guanlinchao@cmu.edu)

Source code for our paper BERT-DST: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer (Interspeech 2019).

@inproceedings{chao2019bert,
title={{BERT-DST}: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer},
author={Chao, Guan-Lin and Lane, Ian},
booktitle={INTERSPEECH},
year={2019}
}

Tested with Python 3.6 and TensorFlow 1.13.0rc0.

Required packages (no installation needed; just provide their paths in the code):

  1. bert: the google-research/bert source package.
  2. uncased_L-12_H-768_A-12: pretrained BERT-Base, Uncased model checkpoint. Download link in the bert repository.
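A minimal sketch of the expected setup. The directory names and variable names below are assumptions for illustration; place the bert package and the unzipped checkpoint wherever you like and point the path variables in the BERT-DST code at them.

```shell
# Sketch of the expected layout (paths are assumptions; adjust to your setup).
# 1. Clone the bert source package:  https://github.com/google-research/bert
# 2. Download and unzip the BERT-Base, Uncased checkpoint
#    (download link is in the bert repository's README).
# Then set the corresponding paths in the BERT-DST code to:
BERT_DIR="$PWD/bert"                          # the bert source package
BERT_CKPT_DIR="$PWD/uncased_L-12_H-768_A-12"  # the unzipped checkpoint
echo "bert code:  $BERT_DIR"
echo "checkpoint: $BERT_CKPT_DIR"
```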

Datasets:

dstc2-clean, woz_2.0, sim-M and sim-R
