
Sonic Sleight of Hand: Sound Induces Illusory Distortions in the Perception and Prediction of Robot Action

Code and data repository for the following publication: https://doi.org/10.1007/s12369-024-01105-5

Cite as:

@article{currie2024sonic,
  title     = {Sonic Sleight of Hand: Sound Induces Illusory Distortions in the Perception and Prediction of Robot Action},
  author    = {Currie, Joel and Giannaccini, Maria Elena and Bach, Patric},
  journal   = {International Journal of Social Robotics},
  pages     = {1--19},
  year      = {2024},
  publisher = {Springer}
}

Do sounds affect how humans perceive and predict a robot's actions? We found that subtle changes to robot sounds modulate people's localisation responses and push real-time perception towards the predicted future of the robot's actions.

Human collaboration is built on our ability to effortlessly predict one another's behaviour. From manufacturing to food delivery, interaction between humans and robots is becoming more common. Unfortunately, people find robot movement unpredictable, which hinders these interactions.

What is often overlooked is consequential sound (the sound a robot makes as it moves: motors, gears, etc.) as a channel of information between human and robot, unintentionally conveying information about the robot's motion.

We hypothesised that, through multisensory integration, a robot's consequential sound would be involuntarily merged with vision, biasing human visual perception away from a 'true' representation of the robot's actions.
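As a concrete illustration (not a model from the paper), the textbook account of multisensory integration combines cues weighted by their reliability. The minimal sketch below, with made-up numbers, shows how a reliable auditory cue can pull a noisier visual estimate of the hand's final position:

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue
# combination, the standard account of multisensory integration.
# All values are illustrative; none come from the paper or the data.

def integrate(visual_est, visual_var, auditory_est, auditory_var):
    """Combine two position estimates, weighting each cue by its
    reliability (inverse variance)."""
    w_visual = (1 / visual_var) / (1 / visual_var + 1 / auditory_var)
    return w_visual * visual_est + (1 - w_visual) * auditory_est

# A noisy visual estimate of the hand's stopping point (say, 30 cm)
# is pulled toward the position implied by a more reliable sound cue:
print(integrate(visual_est=30.0, visual_var=4.0,
                auditory_est=33.0, auditory_var=2.0))  # -> 32.0
```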

In our first experiment, we showed participants a robot completing a reach or a withdrawal. We manipulated the robot's consequential sound to end 100 ms before or after the stopping point of the motion. Participants then used a computer mouse to localise the robot's hand.

[Figure: Methods, Experiments 1a and 1b]

We found that participants systematically overestimated the robot's motion when its sound ended after the action, and underestimated it when the sound ended before the motion. We then replicated this finding in a preregistered experiment, where we also found that making the robot's motion more variable increased the influence of the sound manipulation.

[Figure: Results, Experiments 1a and 1b]
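For readers exploring the data, a minimal sketch of the Experiment 1 analysis might look like the following. The file name and column names (clicked_x, true_x, sound_offset) are assumptions for illustration, not the repository's actual variable names:

```python
import pandas as pd

# Hypothetical analysis sketch: per-trial localisation error is the
# clicked position minus the hand's true stopping point; we then
# compare the two sound-offset conditions (-100 ms vs +100 ms).
trials = pd.read_csv("experiment1_trials.csv")  # hypothetical file
trials["error"] = trials["clicked_x"] - trials["true_x"]

# A positive mean error in the "+100 ms" condition indicates
# overestimation; a negative one in "-100 ms", underestimation.
print(trials.groupby("sound_offset")["error"].agg(["mean", "sem"]))
```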

In a second series of experiments, we changed our measure: after the motion, participants judged whether a presented probe image (showing the immediate future or past of the action) was the same as or different from the robot's last seen position.

[Figure: Methods, Experiments 2a and 2b]

These two experiments (the second a preregistered replication) found that when participants saw the robot's action with the longer sound, their probe judgements were pushed further into the future of the action than with the shorter sound.

[Figure: Results, Experiments 2a and 2b]
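Again purely as an illustrative sketch (with assumed file and column names, not the repository's actual ones), the probe-judgement shift could be summarised like this:

```python
import pandas as pd

# Hypothetical analysis sketch: probes are indexed by how far into
# the action's future (probe_step > 0) or past (probe_step < 0) they
# fall; judged_same is 1 when a probe was accepted as matching the
# robot's last seen position.
probes = pd.read_csv("experiment2_trials.csv")  # hypothetical file

acceptance = (probes
              .groupby(["sound_duration", "probe_step"])["judged_same"]
              .mean()
              .unstack("probe_step"))
print(acceptance)  # higher acceptance of future probes is expected
                   # under the longer consequential sound
```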

What does this mean? The consequential sound a robot makes as it moves can not only affect how you respond to its action, but can also shift your real-time visual perception towards your predictions of the robot's movements.

Together, these results offer new methods of measuring how people really 'perceive' a robot's actions, and show that people's representations of robot actions can be supplemented (or contaminated) by controlled illusory effects.

Sound is just one tool for biasing the perception of robot motion. We hope that by understanding how people truly represent and predict robot action, these factors can be accounted for in the design of more predictable robot behaviours, improving human-robot collaboration.
