Introducing an Unreal Engine toolkit for audio-visual perception studies in the virtual lab IHTApark
Abstract:
Analyzing the perceived quality of urban spaces is a difficult task due to the lack of standardized methods for evaluating architectural design, especially in situations where sound matters. To conduct such experiments under reproducible conditions, the use of virtual audio-visual models has been proposed. One example of a VR model recently developed as a study framework is the digital twin of a green space called IHTApark. Its visual appearance is implemented and rendered in Unreal Engine, while the sound is reproduced by the open-source software Virtual Acoustics. The model allows for fully synthesized sound sources as well as real-life recordings, which enhances the realism of the acoustic environment.

This poster introduces a toolkit for conducting perception studies in such virtual environments. Most importantly, the toolkit contains a user interface for Unreal Engine that handles stimulus selection and collects subjective ratings. MUSHRA-like ratings are used to perform sensory evaluations such as individual vocabulary profiling. The interface is realized as an open-source plugin for Unreal Engine and uses a TCP server/client structure to communicate with external software, e.g. MATLAB, where experiment parameters are controlled and results are processed.
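Since the plugin's TCP protocol is described here only at a high level, the following is a minimal sketch of how an experiment-control client might drive the interface, with Python standing in for MATLAB. The host, port, and the newline-terminated "command:value" message format are hypothetical placeholders, not the plugin's documented API.

    # Minimal sketch of an experiment-control client (Python stand-in for MATLAB).
    # Host, port, and message format below are assumptions for illustration only.
    import socket

    HOST, PORT = "127.0.0.1", 12345  # assumed address of the plugin's TCP server

    with socket.create_connection((HOST, PORT)) as conn:
        # Ask the Unreal Engine interface to present a stimulus (hypothetical command).
        conn.sendall(b"play_stimulus:3\n")
        # Block until the participant submits a rating, then read it back.
        rating = conn.recv(1024).decode().strip()
        print("Received rating:", rating)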