Motion Behavior Predicts Speech Perception Difficulty in Virtual Audio-Visual Multi-Talker Environments
Abstract:
In natural listening situations, people typically move their head and eyes, a strategy that likely enhances the sensory information available for exploring the auditory scene. However, the actual pattern of such a strategy has not yet been described. Here, we investigated how head and eye movements are affected by reverberation and by the number of spatially distributed simultaneous talkers in a scene. In a combined speech-comprehension and localization task, 13 normal-hearing listeners had to indicate the source of a target story presented among other, competing stories. The audio-visual scenes were presented via a loudspeaker array and virtual-reality glasses. The listeners' first head movement was delayed when more talkers were present in the scene. Both reverberation and a higher number of talkers increased the duration of the subsequent search period, the number of fixated source locations, and the number of gaze jumps. With more talkers, the decision period after the last head movement was prolonged because the listeners continued to move their eyes in the vicinity of the target. In reverberant scenes, the final head position was further away from the target position. Overall, the results demonstrate that the complexity of the acoustic scene affects listener behavior during speech comprehension and localization.
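To illustrate the kind of metric reported above (the onset of the listener's first head movement), the following minimal Python sketch estimates a movement onset from a head-yaw trace using a simple angular-velocity threshold. This is not the authors' analysis pipeline; the sampling rate, velocity threshold, minimum-duration criterion, and the synthetic trace are illustrative assumptions only.

    import numpy as np

    def first_head_movement_onset(yaw_deg, fs=90.0, vel_thresh_deg_s=10.0, min_dur_s=0.1):
        """Estimate the onset time (s) of the first head movement from a yaw trace.

        yaw_deg          : 1-D array of head yaw angles in degrees
        fs               : sampling rate in Hz (assumed value)
        vel_thresh_deg_s : angular-velocity threshold marking movement (assumed)
        min_dur_s        : movement must exceed the threshold for this long (assumed)
        """
        vel = np.abs(np.gradient(yaw_deg) * fs)   # angular speed in deg/s
        above = vel > vel_thresh_deg_s
        min_samples = int(min_dur_s * fs)
        # Find the first run of supra-threshold samples long enough to count as a movement.
        count = 0
        for i, flag in enumerate(above):
            count = count + 1 if flag else 0
            if count >= min_samples:
                return (i - min_samples + 1) / fs
        return None  # no movement detected

    if __name__ == "__main__":
        fs = 90.0
        t = np.arange(0.0, 5.0, 1.0 / fs)
        # Synthetic example: head still for 1.5 s, then a 40-degree turn toward a talker.
        yaw = np.where(t < 1.5, 0.0, 40.0 * (1.0 - np.exp(-(t - 1.5) / 0.3)))
        print(f"estimated onset: {first_head_movement_onset(yaw, fs):.2f} s")

With such an onset estimate per trial, the delay of the first head movement can be compared across conditions (for example, scenes with different numbers of talkers), in the spirit of the analysis summarized in the abstract.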