Babak Allahgholipour, Multi-Sensory Perception in Virtual World Animation Using the McGurk Effect

M.Sc. Candidate: Babak Allahgholipour

Program: Multimedia Informatics

Date: 09.12.2019

Place: A-212

Abstract: Knowledge of multisensory perception and the brain's cognitive mechanisms can lead to the creation of better virtual worlds. The brain stores visual and auditory information and uses it to predict future events. During speech recognition, the brain records sounds together with their visual presentation, building a memory of visual cues and their auditory mappings; on the basis of this process, it forms a prediction mechanism for future events. When the brain has difficulty perceiving an auditory stimulus, it relies on these visual mappings to guess the result. Because some mappings rest on coincidental, unrelated similarities, they do not always lead to a better understanding of the surrounding world, and when prior information misleads the brain's interpretation of a situation, cognitive biases arise. The McGurk effect demonstrates how different lip movements alter auditory perception, producing disparate information: for example, the sound /ba/ paired with lip movements for /ga/ is often heard as /da/. A variety of factors influence how much a listener depends on visual cues. Following speech, especially in challenging environments, can be improved by visual cues such as facial expressions and tongue and lip movements. Considering these elements in virtual world and game design improves believability. Perception processes, the nature of the brain, design factors, graphical elements, and visual structures are discussed in order to find improved ways of designing realistic auditory and visual components.