Entwicklung einer Experimentierumgebung für den perzeptiven Vergleich audiovisueller Szenen mit dynamischer Audiowiedergabe
Bachelor thesis by Laura Pilger
In recent years, virtual reality (VR), especially in combination with virtual acoustics, has become a growing field in biomedical engineering due to its potential to create controllable and reproducible real-life scenarios in testing environments. VR has the advantage of being mutable and adaptable with regard to different scene settings, which allows a broader spectrum of experiments to be conducted. Hence, in the context of this Bachelor thesis, a virtual reality interface was developed to facilitate the execution of audiovisual experiments. The experimental space is visualized in the VR environment Unity. Using C# scripts, an interface is implemented between Unity, the auralisation framework Virtual Acoustics, the tracking system OptiTrack, and the HTC Vive VR set with Head-Mounted Display (HMD) and controller. In a pilot experiment on audiovisual sound localisation, four blocks were conducted that differed only in their visual settings, while the acoustic sound source positions remained constant across blocks. Since it is well known that the amount of available visual cues can affect human sound localisation, the goal of the experiment was to determine the influence of varying visual cues on the participants' localisation ability in VR.