This repository contains some of our source code and information for the INTERSPEECH 2020 Computational Paralinguistic Challenge.
Update
Our paper has been accepted at INTERSPEECH 2020; you can get the full paper here 👇:
https://isca-speech.org/archive/Interspeech_2020/pdfs/2999.pdf
Our system won 2nd place in the Elderly sub-challenge on recognizing elderly speakers' emotion, and ranked 11th in the Mask sub-challenge on predicting whether the speaker is wearing a mask.
Citation
If you use this code, please consider citing:
@article{yang2020exploration,
  title={Exploration of Acoustic and Lexical Cues for the INTERSPEECH 2020 Computational Paralinguistic Challenge},
  author={Yang, Ziqing and An, Zifan and Fan, Zehao and Jing, Chengye and Cao, Houwei},
  journal={Proc. Interspeech 2020},
  pages={2092--2096},
  year={2020}
}
Paper Name:
Exploration of Acoustic and Lexical Cues for the INTERSPEECH 2020 Computational Paralinguistic Challenge
Results:

| On Test  | Mask | Elderly (A/V) |
| -------- | ---- | ------------- |
| baseline | 71.8 | 50.4/49       |
| ours     | 75.1 | 54.3/59       |
Our team members:
Ziqing Yang, Zifan An, Zehao Fan, Chengye Jing
Our Instructor:
Prof. Houwei Cao
For more information, please check out our website 👇:
https://sites.google.com/nyit.edu/seniorproject2020-interspeech
Please feel free to contact us:
zyang23@nyit.edu, zan01@nyit.edu, zfan06@nyit.edu, cjing@nyit.edu