Listening to, and remembering conversations between two talkers: Cognitive research using embodied conversational agents in audiovisual virtual environments

Key Info

Basic Information

Funding Program:
SPP2236 Audictive
Research Area:
Auditory Scene Analysis,
Binaural Technology,
Acoustic Virtual Reality

Listening to, and remembering, the content of conversations is a fundamental aspect of communication in many contexts (e.g., private, professional, educational). Nevertheless, the impact of listening to realistic running speech, rather than single syllables, words, or isolated sentences, has hardly been researched from cognitive-psychological or acoustic perspectives. In particular, the influence of audiovisual characteristics, such as plausible spatial acoustics and visual co-verbal cues, on memory performance and comprehension of spoken text has been largely ignored in previous research. The proposed project therefore aims to investigate the combined effects of these potentially performance-relevant but scarcely addressed audiovisual cues on memory and comprehension of running speech. Our overarching methodological approach is to develop an audiovisual Virtual Reality (VR) testing environment that includes embodied Virtual Agents (VAs). This testing environment will be used in a series of experiments to investigate basic aspects of visual-auditory cognitive performance in a close(r)-to-real-life setting. This, in turn, will provide insights into the contribution of acoustic and visual cues to cognitive performance, user experience, and presence in VR applications, as well as to the quality and vibrancy of such applications, especially those with a social-interaction focus.

The project addresses three main objectives:

In terms of auditory cognition, the focus is on studying the effects of systematic variations in the audiovisual 'realism' of virtual environments on memory and comprehension of multi-talker conversations. The results, in terms of measures of short-term memory, comprehension, and listening effort, will help to develop theories in the field of auditory cognition research.
With respect to interactive virtual environments, the project will investigate how fidelity characteristics of audiovisual virtual environments contribute to the realism and liveliness of social VR scenarios with embodied VAs. In particular, the project will study how combined verbal/acoustic and co-verbal/visual cues in such scenarios affect a user’s experience and cognitive performance, and will also examine the technical quality criteria to be met in terms of spatial audio reproduction and visual immersion.

Contributing to quality evaluation methods, the proposed project will study the suitability of text-memory and comprehension measures, as well as subjective judgements, for assessing the quality of experience (QoE) of a VR environment. Knowing which instances of enhanced realism in the VR environment lead to variations in cognitive performance and/or subjective judgements is valuable in two ways: it determines the degree of 'realism' necessary for auditory cognition research into memory and comprehension of heard text, and it informs the audiovisual characteristics of a perception-, cognition-, and experience-optimized VR system.