Interpolation of scheduled simulation results for auralization
Master's Thesis of João Garrett Fatela
Auralization is a technique that renders realistic 3D audio from virtual spatial scenes. Certain sound propagation parameters (so-called auralization parameters), such as propagation delay, atmospheric attenuation, and spreading loss, can be derived from physics-based simulations of virtual scenes. These parameters are essential for rendering plausible sound signals in a spatial context. For complex scenes, the simulations can be computationally demanding, which becomes a bottleneck for real-time auralization: the resources required by the simulation must not compromise the real-time audio output. As a result, the simulations are run at a much lower rate than the audio rendering operations. Consequently, sound propagation parameter values may "jitter" over time, which can lead to audible artifacts in the output signal. This is especially true for fast-moving sound sources, such as aircraft, whose parameters can change drastically within relatively short time intervals. In this thesis, different methods for the interpolation of sound propagation parameters are studied, tested, and implemented in the context of aircraft flyovers. Interpolation serves both to upsample and to smooth the auralization parameter data, with the goal of avoiding audible artifacts in real-time auralization.
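To make the upsampling idea concrete, the following is a minimal sketch (not the method developed in the thesis) of the simplest such interpolator: linear interpolation of auralization parameters between two scheduled simulation results, queried at the higher audio-block rate. All names, parameter values, and update rates here are illustrative assumptions.

```python
import numpy as np

def interpolate_parameters(t_prev, params_prev, t_next, params_next, t_query):
    """Linearly interpolate auralization parameters between two
    scheduled simulation results (illustrative sketch only)."""
    # Fractional position of the query time between the two updates.
    alpha = (t_query - t_prev) / (t_next - t_prev)
    # Clamp so queries outside the interval hold the boundary values.
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return {key: (1.0 - alpha) * params_prev[key] + alpha * params_next[key]
            for key in params_prev}

# Hypothetical scenario: simulation updates arrive at 10 Hz (every 0.1 s),
# while the audio renderer needs parameter values for every audio block.
prev = {"delay_s": 0.50, "spreading_loss_db": -40.0}   # update at t = 0.0 s
next_ = {"delay_s": 0.48, "spreading_loss_db": -39.0}  # update at t = 0.1 s

# Query halfway between the two simulation updates.
mid = interpolate_parameters(0.0, prev, 0.1, next_, 0.05)
```

A real-time renderer would call such an interpolator once per audio block, replacing the stepwise parameter updates (the source of the "jitter") with a smoothly varying trajectory; higher-order or delay-aware schemes follow the same query pattern.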