Perception, Attention, Action

Perception, Attention, Action: making sense of and interacting with our environment

General presentation


‘Perception, attention and action’ represents the main focus of 12 teams, with several intra-CRNL collaborations. Molecular, cellular, network, cognitive and computational approaches are developed to tackle, in humans and animal models, the brain mechanisms of perception and their modulation by attentional and emotional processes, both in the context of action production and of (un)conscious perception.


Major contributions


Perception

Olfaction: The CMO and ENES teams have shown how the physicochemical properties of odorant molecules, learning and attention modulate odor sampling, a complex sensorimotor behavior. The NEUROPOP team performed a very large-scale online olfactory test in humans (50,000 French participants) devoted to odor identification, combining computational sciences and neuroscience, and recently investigated COVID-19-related deficits. At the neurological level, the TIGER team identified a preferential involvement of the left hemisphere in epileptic patients presenting with disorders of odor pleasantness judgement.

Audition: Virtual reality was used to shed light on sound localization in cochlear-implanted deaf people (IMPACT). Sound localization abilities were also described for the first time in crocodiles (CAP & ENES). The PAM team developed several approaches to disorders of non-verbal auditory cognition, including music perception. The CAP, PAM and TRAJECTOIRE teams showed how music can be a tool for cognitive remediation.

Vision: The COPHY team characterized the effect of attentional context on the frequency coupling of cortical oscillatory signals, using a combination of EEG-MEG recordings and computational models. The IMPACT team revealed how eye movement plasticity modulates visuo-spatial attention, and the reverse coupling is now under investigation. This team has also characterized the interactions between visual attention and memory during free visual exploration tasks, in both naturalistic and virtual reality settings. The team further described the interactions between spatial (distance) and intrinsic (emotion) parameters of visual stimuli in their explicit and implicit appraisal.

Somatosensation: The WAKING team performed intracellular recordings of thalamic neurons during exploratory whisker movements. The IMPACT team obtained behavioral and computational evidence that the human brain represents hand-held tools as sensory, and not only motor, body extensions. Rare patient cases of left hyperschematia showed how attention modulates body representation (TRAJECTOIRE). The NEUROPAIN team performed intracranial thalamo-cortical recordings to describe conscious pain perception, and also investigated the cerebral substrates of hyperalgesia, shedding light on dysfunctions of somatic sensation.

Attention

The EDUWELL team has developed the ‘ATOLE’ program for the education of attention in children and teenagers, which is now in place in more than 5,000 classes. This team has also demonstrated the role of attention in mathematics in children and, using intracranial and surface EEG, has studied negative (distraction) and positive (meditation-based) modulations of self-focused attention. Other brain markers, such as pre-stimulus functional connectivity, have been related to the perception of both innocuous and noxious stimuli during waking and sleep (NEUROPAIN). The PAM team also developed diagnostic and remediation tools for attention and auditory deficits, adapted to children, adults and the elderly.

Action, agency

The IMPACT team provided further insights into decision making by studying the basal ganglia and cortical substrates of movement vigor and urgency. In the social cognition domain, the team also explored the effects of the mere presence of a peer on sensorimotor and cognitive performance.


Contacts


Denis Pélisson (IMPACT), denis.pelisson@inserm.fr

Luis Garcia-Larrea (NEUROPAIN), luis.garcia-larrea@inserm.fr

Yohana Leveque (CAP), yohana.leveque@inserm.fr