MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversation
Context-Dependent Sentiment Analysis in User-Generated Videos
Multimodal sentiment analysis using hierarchical fusion with context modeling
Unsupervised Multimodal Clustering for Semantics Discovery in Multimodal Utterances (ACL 2024)
A multimodal face liveness detection module for use in face anti-spoofing
Project for Multimodal Interaction course (A.Y. 2019/2020), GesturePad
Repository containing MMI development during the UTA CSE REU 2019. Uses Open-Myo, a module for reading data from a Myo armband over a generic BLE interface.
A multimodal AI assistant built with Google Gemini-1.5-pro, gTTS, PIL, and SpeechRecognition
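The assistant above composes four off-the-shelf pieces: SpeechRecognition for input, Gemini for reasoning (optionally over a PIL image), and gTTS for spoken output. A minimal sketch of that pipeline, assuming the google-generativeai, SpeechRecognition, gTTS, and Pillow packages; the structure and function names here are illustrative, not the repository's actual code:

```python
# Illustrative sketch of a voice-in, Gemini, voice-out loop
# (not taken from the repository above; names are hypothetical).
import google.generativeai as genai
import speech_recognition as sr
from gtts import gTTS
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-pro")

def listen() -> str:
    """Capture one utterance from the microphone and transcribe it."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)

def ask(question: str, image_path: str | None = None) -> str:
    """Send text, optionally with an image opened via PIL, to Gemini."""
    parts: list = [question]
    if image_path:
        parts.append(Image.open(image_path))
    return model.generate_content(parts).text

def speak(text: str, out_path: str = "reply.mp3") -> None:
    """Synthesize the reply to an MP3 file with gTTS."""
    gTTS(text).save(out_path)

if __name__ == "__main__":
    question = listen()
    answer = ask(question)
    speak(answer)
```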
A multimodal skill built with Amazon Alexa Skills Kit that educates children on the importance of numbers and dates.