
INTERFACE

Role of multisensory integration in facial emotion understanding; developmental and neuropsychological approaches
Funder: French National Research Agency (ANR)
Project code: ANR-11-EMCO-0002
Funder contribution: 220,001 EUR

Description

The general objective of the project is to investigate the role of multisensory integration in the ability to recognize and understand facial emotion, considering both developmental aspects in infancy and neuropsychological aspects in schizophrenia. The main hypothesis is that multisensory integration boosts the ability to recognize facial emotion, and its development, through a matching between one's own facial reactions to multisensory stimulation and the facial expressions of conspecifics. One question is how and when such a matching mechanism might influence facial expression processing. This issue will be addressed by studying healthy adults, using both behavioral and electrophysiological approaches. We will also investigate the role of matching self-generated expressions with the expressions of others by studying patients who show deficits in both expressiveness and the recognition of expressions, namely patients with schizophrenia.

The second hypothesis is that non-visual sensory modalities may shape and constrain the matching process. Each modality has its own sensory processes and hedonic properties, and positive or negative feelings arise from these modality-specific sensations. A direct consequence is that no single modality parses the emotional environment in a way that corresponds directly to facial emotion categories (e.g., happiness, anger, disgust, fear, and sadness), at least before associations are learned. Thus, multisensory integration not only promotes coupling between self-generated expressions and the expressions of others; it can also shape this coupling according to the characteristics of each modality and their development. We will test this hypothesis in infants.

The project comprises three main tasks. The first task will focus on healthy adults. Its main objective is to understand whether, how, and when the olfactory context influences facial emotion recognition.
In experiment 1, we will study the influence of the olfactory context on facial emotion categories, to assess whether the boundaries of these categories are flexible as a function of the emotional meaning of the odour. In experiment 2, we will study the potential early influence of the olfactory context on visual attention, using a visual search paradigm. Finally, in experiment 3, we will use ERP recordings to determine at which (temporal) level the olfactory context is integrated.

The second task will comprise infant studies. The purpose here is to determine whether infants can associate odour-induced feelings with facial expressions and, if so, at what age. In experiment 4, we will study infants' ability to match an odour with dynamic facial expressions. In experiment 5, we will study whether the olfactory context elicits infants' expectations of specific facial actions in neutral static expressions. Finally, in experiment 6, we will study whether infants can infer their mother's emotional state from stress-elicited body odours, and whether they form expectations of specific facial information.

In the third task, we will study patients with schizophrenia, who are known to show deficits in both facial expression recognition and expressiveness (i.e., flat affect). The purpose of this task is to assess whether the deficit in facial expressiveness in schizophrenia influences both the categorical perception of facial expressions and the contextual effect of olfactory stimulation. To test this hypothesis, we will adapt experiments 1 and 3 (task 1) for the study of patients with schizophrenia.
This approach, coupling cognitive psychology, developmental psychology, ethology, and neuropsychology, will allow a better understanding of the processes at work during multisensory integration and facial emotion recognition and understanding, by comparing human cognitive and emotional processes at different stages: mature, under acquisition, or altered.

