A Novel Interpretable and Generalizable Re-Synchronization Model for Cued Speech Based on a Multi-Cuer Corpus
This repository provides the code and the dataset link for our INTERSPEECH 2023 paper:
@inproceedings{is2023resync_cs,
  title     = {{A Novel Interpretable and Generalizable Re-Synchronization Model for Cued Speech Based on a Multi-Cuer Corpus}},
  author    = {Lufei Gao and Shan Huang and Li Liu},
  booktitle = {Proceedings of the 24th Annual Conference of the International Speech Communication Association (INTERSPEECH)},
  year      = {2023}
}
Please cite our paper if you use the MCCS dataset to produce published results or incorporate it into other projects.
The dataset can be accessed from: https://mccs-2023.github.io/