Final Thesis

Short-term memory performance under audio-visual mismatch of lip-movements

Key Info

Basic Information

Professorship:
HTA
Status:
ongoing
Research Area:
Audio-visual Interaction
Type of Thesis:
Bachelor

Contact

Bachelor Thesis of Li, Tong

In modern life, the proportion of computer-mediated conversation is steadily increasing, e.g., lectures via Zoom or meetings in virtual reality (VR). In real life, information is perceived multimodally: in addition to visual information, sound is received as auditory information and remembered together with coherent visual cues such as lip movements. However, due to network instability in video calls or to the quality of a VR scene design, the auditory and visual content can mismatch, i.e., the lip movements no longer match the heard voice. It is possible that this audio-visual mismatch impedes memory performance.

The aim of this thesis is to evaluate the effect of an audio-visual mismatch between lip movements and the spoken word on short-term memory by means of a serial recall listening experiment, and to analyze the results with regard to known short-term memory mechanisms. Furthermore, it is investigated whether real lip movements and the lip movements of virtual characters affect the results in different ways. The results can inform the future development of VR scenes.
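As a minimal illustration of how such a serial recall experiment is typically scored, the sketch below uses strict serial-position scoring, where an item only counts as correct if it is recalled in the position in which it was presented. This scoring scheme, the function name, and the example word lists are assumptions for illustration, not details taken from the thesis itself:

```python
def serial_recall_score(presented, recalled):
    """Strict serial-position scoring: an item is counted as correct
    only if it appears in the same position as at presentation."""
    return sum(p == r for p, r in zip(presented, recalled))

# Hypothetical trial: five words presented, participant swaps two of them
presented = ["dog", "cup", "tree", "lamp", "fish"]
recalled = ["dog", "tree", "cup", "lamp", "fish"]

score = serial_recall_score(presented, recalled)          # 3 items in place
proportion_correct = score / len(presented)               # 0.6
```

Per-condition proportions of this kind (e.g., matched vs. mismatched lip movements, real vs. virtual speaker) would then be compared to quantify the memory effect of the audio-visual mismatch.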