HapticCap: A Multimodal Dataset and Task for Understanding User Experience of Vibration Haptic Signals
arXiv: https://arxiv.org/pdf/2507.13318 (Findings of EMNLP 2025)
HapticCap is a multimodal dataset and benchmark task designed for understanding user experience of vibration-based haptic signals.
It provides a new resource for research at the intersection of haptics, text, and multimodal learning.
- Modality: Vibration haptic signals, paired with textual annotations
- Textual Annotations:
  - Sensory: refers to physical attributes of the vibration (e.g., the intensity of tapping).
  - Emotional: denotes affective impressions (e.g., the mood of a scene).
  - Associative: references familiar real-world experiences (e.g., the buzzing of a bee, a heartbeat).
- Format: Signals stored as time-series data; annotations in JSON, keyed by haptic signal ID
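To illustrate the pairing of time-series signals with JSON annotations, here is a minimal sketch. The record fields (`signal_id`, `sensory`, `emotional`, `associative`) and the synthetic signal are assumptions for illustration; the actual HapticCap schema may use different key names.

```python
import json
import numpy as np

# Hypothetical annotation record; the real HapticCap JSON schema may differ.
annotation = json.loads("""
{
  "signal_id": "sig_0001",
  "sensory": "a rapid, light tapping",
  "emotional": "feels urgent and tense",
  "associative": "like the buzzing of a bee"
}
""")

# Hypothetical signal: a 1-D vibration amplitude time series.
signal = np.sin(np.linspace(0, 20 * np.pi, 2000))

def pair(signal, annotation):
    """Join a signal with its three-category description by signal ID."""
    return {
        "id": annotation["signal_id"],
        "signal": signal,
        "descriptions": {c: annotation[c]
                         for c in ("sensory", "emotional", "associative")},
    }
```

`pair(signal, annotation)` then yields one multimodal example with the signal and all three description categories attached.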
- Data access:
  - Haptic vibration signals (Google Drive): https://drive.google.com/drive/folders/1xylMC-EFswTc3adcc6rAzyFsXLSmVweg?usp=drive_link
  - Human descriptions (Google Drive): https://drive.google.com/drive/folders/1ovlIbfJecXAq0TbItmrRl5dVV7OCCQzB?usp=drive_link
  - Or find the data on Hugging Face: https://huggingface.co/datasets/GuiminHu/HapticCap
- Haptic-caption retrieval: using the haptic signal as the query and the textual descriptions as the target documents, the goal is to retrieve the descriptions from all three categories (sensory, emotional, associative) that correspond to the given signal.
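The retrieval step can be sketched as a nearest-neighbor search over embeddings. This is a toy illustration using cosine similarity over random vectors; the actual encoders that produce haptic and text embeddings are those trained in the paper, and `rank_descriptions` is a hypothetical helper name.

```python
import numpy as np

def rank_descriptions(haptic_emb, text_embs):
    """Rank candidate description embeddings by cosine similarity
    to the haptic-signal query embedding (all assumed L2-normalized)."""
    scores = text_embs @ haptic_emb   # cosine similarity per candidate
    return np.argsort(-scores)        # indices, best match first

# Toy example: 4 candidate descriptions; the query embedding is a slightly
# perturbed copy of candidate 2, so it should be retrieved first.
rng = np.random.default_rng(0)
text_embs = rng.normal(size=(4, 8))
text_embs /= np.linalg.norm(text_embs, axis=1, keepdims=True)
haptic_emb = text_embs[2] + 0.05 * rng.normal(size=8)
haptic_emb /= np.linalg.norm(haptic_emb)
```

Here `rank_descriptions(haptic_emb, text_embs)[0]` returns index 2, the matching description.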
- Training, validation, and test splits on Hugging Face:
  https://huggingface.co/datasets/GuiminHu/HapticCap
We design a supervised contrastive learning framework that pulls samples of the same class together in the embedding space while pushing apart samples from different classes.
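A minimal NumPy sketch of such a supervised contrastive objective (a SupCon-style loss; the paper's exact loss formulation, temperature, and encoders may differ):

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over L2-normalized embeddings.

    Same-class pairs are treated as positives; the loss is the negative
    mean log-probability of positives under a softmax over all other
    samples in the batch.
    """
    n = features.shape[0]
    sim = features @ features.T / temperature            # pairwise similarities
    logits = sim - sim.max(axis=1, keepdims=True)        # numerical stability
    exp = np.exp(logits)
    self_mask = ~np.eye(n, dtype=bool)                   # exclude i == i pairs
    pos_mask = (labels[:, None] == labels[None, :]) & self_mask
    log_prob = logits - np.log((exp * self_mask).sum(axis=1, keepdims=True))
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                               # anchors with >=1 positive
    mean_log_prob = (log_prob * pos_mask).sum(axis=1)[valid] / pos_counts[valid]
    return -mean_log_prob.mean()
```

With well-clustered embeddings the loss is near zero; shuffling the labels so positives point at dissimilar samples makes it large, which is the gradient signal that pulls same-class clusters together.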
If you find this dataset useful for your research, please cite our paper:
@article{hu2025hapticcap,
title={{HapticCap}: A multimodal dataset and task for understanding user experience of vibration haptic signals},
author={Hu, Guimin and Hershcovich, Daniel and Seifi, Hasti},
journal={arXiv preprint arXiv:2507.13318},
year={2025}
}
