Acoustic Virtual Reality


Human beings interact with their environment through a multitude of sensory stimuli. Besides visual, haptic and tactile perception, acoustics provides important information about the environment. One gains an impression of the surroundings, of external events and, in particular, feedback on one's own actions (proprioception: perceiving oneself). To achieve optimal immersion in a virtual, simulated environment, the simulated stimuli for all senses must be consistent.

Sound propagates through structures and through the air, is reflected by objects such as the walls of a room, and eventually reaches the position of the listener. The listener then evaluates various properties of these sound cues, among them loudness, timbre, the direction of incidence of the direct sound and the perceived distance. In addition, conclusions can be drawn about the environment in which the listener is located, in a room for example about its size and shape. This is possible because the fine structure of the impulse response between source and receiver, which consists of the superposition of the direct sound and its reflections, is analyzed in the time domain. This identifies a system of three components connected in series: the generation of sound, its propagation to the receiver, and finally the perceptual processes of the listener, which are investigated in the field of psychoacoustics. Psychoacoustics in particular defines the quality requirements that the individual components have to meet. Moreover, a VR system must also take sound reproduction into account in order to deliver the acoustic cues to the ears in a suitable manner.
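The time-domain picture described above, direct sound plus reflections forming an impulse response, can be illustrated with a minimal sketch: auralization essentially amounts to convolving a dry (anechoic) source signal with a room impulse response. The signal and the toy impulse response below are fabricated for illustration only, not taken from any real measurement or from the ITA tools.

```python
import numpy as np

fs = 44100  # sampling rate in Hz

# Hypothetical anechoic ("dry") source signal: a short 1 kHz tone burst.
t = np.arange(0, 0.5, 1 / fs)
dry = np.sin(2 * np.pi * 1000 * t) * np.hanning(t.size)

# Toy room impulse response: direct sound, two discrete early
# reflections, and an exponentially decaying diffuse tail.
rir = np.zeros(fs)                     # 1 s long
rir[0] = 1.0                           # direct sound
rir[int(0.012 * fs)] = 0.6             # first reflection after 12 ms
rir[int(0.023 * fs)] = 0.4             # second reflection after 23 ms
rng = np.random.default_rng(0)
tail = rng.standard_normal(rir.size) * np.exp(-6.9 * np.arange(rir.size) / rir.size)
rir += 0.2 * tail                      # diffuse decay, roughly 1 s reverberation

# Auralization step: convolve the dry signal with the impulse response.
wet = np.convolve(dry, rir)            # length: len(dry) + len(rir) - 1
```

In a binaural auralization one such convolution is performed per ear, with a pair of binaural room impulse responses instead of the single monaural one used here.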

  Visual model of a church (ITA)

The generation of auditory stimuli is called auralization or "acoustic rendering"; the term should be understood analogously to "visualization". The basis for auralization is the calculation of acoustic impulse responses by means of suitable simulation techniques. Today a number of numerical methods are available, both geometrical and wave-based. The challenge for research and development in auralization, and thus for virtual acoustics, lies in the requirements of real-time processing: the generation of acoustic scenarios (rendering) and their reproduction must happen as quickly as possible, so that no perceptible delays (latencies) or other artefacts occur.
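The real-time constraint mentioned above is commonly addressed with block-wise FFT convolution, so that only one block of latency is incurred instead of waiting for the whole signal. The following is a minimal overlap-add sketch; the block size, filter and test signal are arbitrary illustration values, not taken from any particular VA implementation.

```python
import numpy as np

def overlap_add_stream(blocks, h, block_len):
    """Filter a stream of fixed-size input blocks with FIR filter h
    using FFT-based overlap-add; emits one output block per input block."""
    n_fft = 1
    while n_fft < block_len + len(h) - 1:   # FFT long enough for linear convolution
        n_fft *= 2
    H = np.fft.rfft(h, n_fft)
    tail = np.zeros(n_fft - block_len)      # overlap carried between blocks
    for x in blocks:
        y = np.fft.irfft(np.fft.rfft(x, n_fft) * H, n_fft)
        y[:tail.size] += tail               # add overlap from the previous block
        tail = y[block_len:].copy()         # save the new overlap
        yield y[:block_len]                 # latency: one block

# Usage: the streamed result matches direct convolution.
rng = np.random.default_rng(1)
h = rng.standard_normal(256)                # e.g. a segment of an impulse response
x = rng.standard_normal(4096)
B = 512
blocks = [x[i:i + B] for i in range(0, x.size, B)]
y_stream = np.concatenate(list(overlap_add_stream(blocks, h, B)))
y_ref = np.convolve(x, h)[:y_stream.size]
print(np.allclose(y_stream, y_ref))         # True
```

Real-time renderers typically go one step further and partition long impulse responses into several segments of increasing length, trading latency against FFT efficiency; the single-partition version above only illustrates the principle.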

In parallel with the progress in spatial room acoustic simulation, complex models were developed for simulating vibro-acoustic problems, such as sound insulation in buildings or transfer path analysis and synthesis in the automotive industry. The range of applications of virtual acoustics is very wide and goes far beyond listening to music or assessing the quality of concert halls. Sound design and the investigation of sounds and noise in building acoustics, vehicle acoustics, noise control and psychoacoustics are further interesting fields of application.

 

Video - Examples

Offline visualization and auralization using physics-based room acoustic simulation

 
Eurogress_binaural
 
 

Virtual Acoustics (VA) and the aixCAVE

 
aixCAVE_VATSS
 
aixCAVE_SanJuan
 
 

Virtual Acoustics (VA) and Unity with HMD

The following videos were created by Maurice Andreas as part of his Bachelor's thesis on physics-based real-time auralization with the VR and gaming environment Unity. The material is licensed under Creative Commons Version 2.0.

 
VAUnity_AVStudy_Park
 
 

An interactive park scene demonstrating the use of Unity and Virtual Acoustics (VA) for an audio-visual user study. The fictitious task is to locate the chirping robin in one of the trees with the pointing device, while the audio source and the (hard-to-find) visual prop may be displaced from each other. The hypothesis under investigation is which modality dominates.

For an increased immersive experience, the binaural signal is not played back over headphones but over a transaural loudspeaker setup using dynamic multi-channel crosstalk cancellation.
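The idea behind crosstalk cancellation can be sketched per frequency bin: invert the 2x2 matrix of transfer functions from the two loudspeakers to the two ears, with regularization to keep filter gains bounded. The plant matrix below is a toy example rather than a measured setup, and the function name and parameter are hypothetical, not part of the VA API.

```python
import numpy as np

def ctc_filters(C, beta=1e-3):
    """Per-frequency-bin 2x2 crosstalk-cancellation filters.

    C: array of shape (n_bins, 2, 2), the acoustic transfer matrix from
       the two loudspeakers to the two ears (in practice measured or
       derived from an HRTF dataset; here a toy stand-in).
    Returns H of shape (n_bins, 2, 2) with C @ H ~= I, using Tikhonov
    regularization (parameter beta) to limit the filter gain.
    """
    Ch = np.conj(np.swapaxes(C, -1, -2))           # Hermitian transpose
    I = np.eye(2)
    return np.linalg.solve(Ch @ C + beta * I, Ch)  # (C^H C + beta I)^-1 C^H

# Usage with a toy plant: strong ipsilateral path, weaker contralateral path.
n_bins = 257
C = np.tile(np.array([[1.0, 0.3], [0.3, 1.0]], dtype=complex), (n_bins, 1, 1))
H = ctc_filters(C, beta=1e-6)
# With tiny regularization, C @ H is close to the identity in every bin:
err = np.max(np.abs(C @ H - np.eye(2)))
print(err < 1e-3)  # True
```

The "dynamic" part of dynamic crosstalk cancellation means that C, and hence H, is updated continuously as the tracked listener moves relative to the loudspeakers.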

 
VAUnity_ReverbDemo
 
 

A scene in a virtual room with a band playing music and several small-scale building models on posts, demonstrating interaction with acoustics. When the user sticks their head into one of the models, the reverberation time changes accordingly, based on a binaural artificial reverberation simulation. For movement, portal navigation is available via the hand controller. To interact with the auralization rendering modules, a menu called SoundPallette has been implemented that can switch the auralization modes (e.g. DirectSound, EarlyReflections, DiffuseDecay, DopplerShifts, SpreadingLoss and others). Note: a demonstration of the menu interaction by the user is missing in this clip.

For an increased immersive experience, the binaural signal is not played back over headphones but over a transaural loudspeaker setup using dynamic multi-channel crosstalk cancellation.