Unsupervised Multimodal Clustering for Semantics Discovery in Multimodal Utterances (ACL 2024)
Updated May 23, 2024 - Python
A multimodal face liveness detection module for use in face anti-spoofing
MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversation
Context-Dependent Sentiment Analysis in User-Generated Videos
Multimodal sentiment analysis using hierarchical fusion with context modeling
GesturePad, a project for the Multimodal Interaction course (A.Y. 2019/2020)
Repository for MMI development during the UTA CSE REU 2019. Uses Open-Myo, a module for reading data from a Myo armband over a generic BLE interface.
A multimodal skill built with the Amazon Alexa Skills Kit that teaches children the importance of numbers and dates.