Vision and Cognition


Overview

We study high-level visual perception in the human brain, with a focus on ecologically relevant and meaningful vision: How do I know where I am in space? How fast do I and that ball move? What emotion does that person feel? Have I seen this place, object, or person before? Our studies include links to attention, memory, and social interaction. We collaborate with clinicians to gain insights into disorders such as autism, schizophrenia, and ADHD.

Methodologically, we use non-invasive brain imaging (3T and 9.4T fMRI), and we can perturb neural processing using transcranial magnetic stimulation (TMS), including concurrently with fMRI (using the latest hardware). For data analysis and modelling we use multivariate classifiers as well as classical statistics. Our research questions currently include the following:

Illusions, Scene Segmentation, and Attention

Illusions allow us to dissociate visual processing from visual consciousness; they open windows onto scene segmentation, perceptual grouping, and Gestalt perception, and they reveal the brain's prior knowledge. We primarily use bi-stable stimuli and attentional manipulations to examine these processes.

Natural Scenes, Motion, and Space

Natural scenes (such as feature films) are fascinating: they approximate the visual input our brain evolved to process. We study the interpretation of high-level motion in movies, as well as of people, objects, and space. Using controlled paradigms, we examine how the brain integrates visual signals with body-related signals (efference copies of muscle movements, proprioceptive and vestibular signals) to achieve perceptual stability. The aim is to understand how the brain encodes our position in the environment, and how it reconstructs the surrounding 3D space and objects from visual input. Motion, space, and memory are tightly interlinked.


Emotions and Social Interactions

How do we recognise emotional inflections in facial motion, in body posture, or when observing people interacting? We study how dynamic changes in facial expression and body posture are processed, and how visual and affective brain regions exchange information.

Methods

Stimuli. We use anything from highly controlled stimuli (such as 3D dot fields) and virtual reality to natural movies. Special paradigms such as binocular rivalry and visual illusions allow us to dissociate purely sensory processing from processing related to conscious perception, attentional control, and decision making.

Patients. We are highly interested in understanding the mechanisms that can lead to neglect, autism, ADHD, or schizophrenia. We therefore collaborate with clinicians (neurologists and psychiatrists) and examine their patients using our paradigms, either purely behaviourally or also with fMRI.

Brain imaging. We use fMRI (3T and 9.4T) and EEG. For analyses, we use standard statistics and multivariate approaches to gain insight into neural information content.
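As a toy illustration of the multivariate (decoding) approach, the sketch below implements a correlation-based pattern classifier of the kind commonly used in fMRI analysis: response patterns from held-out trials are correlated with the mean training pattern ("template") of each condition, and assigned the label of the more similar template. This is a minimal sketch on synthetic data, not our actual analysis pipeline; all variable names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "voxel patterns": two stimulus conditions, 20 trials each,
# 50 voxels, a small condition-specific signal embedded in noise.
n_trials, n_voxels = 20, 50
signal_a = rng.normal(0, 1, n_voxels)
signal_b = rng.normal(0, 1, n_voxels)
X_a = signal_a + rng.normal(0, 2, (n_trials, n_voxels))
X_b = signal_b + rng.normal(0, 2, (n_trials, n_voxels))

def classify(train_a, train_b, test):
    """Correlate a test pattern with each condition template and
    return the index (0 or 1) of the more similar condition."""
    templates = np.vstack([train_a.mean(axis=0), train_b.mean(axis=0)])
    corr = np.corrcoef(np.vstack([test, templates]))[0, 1:]
    return int(np.argmax(corr))

# Train on the first half of trials, test on the held-out second half.
half = n_trials // 2
correct = 0
for label, X in enumerate([X_a, X_b]):
    for trial in X[half:]:
        correct += (classify(X_a[:half], X_b[:half], trial) == label)
accuracy = correct / (2 * half)
print(f"decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy (greater than 0.5) indicates that the voxel patterns carry information about the stimulus condition, which is the core logic behind fMRI decoding analyses.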

Brain stimulation. We use neuronavigated transcranial magnetic stimulation (TMS) to perturb perception, attentional processes, and associated decision making, in order to test the causal involvement of brain regions in a task. Simultaneous TMS-fMRI lets us examine the neural effects of various TMS protocols using the latest 7-channel surface coils.

Throughout most experiments we use eye tracking (EyeLink or Arrington).

Selected Publications

  • Bannert M and Bartels A (2018). Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Journal of Neuroscience 38(15):3657-3668.
  • Grassi PR, Zaretskaya N and Bartels A (2018). A Generic Mechanism for Perceptual Organization in the Parietal Cortex. Journal of Neuroscience 38(32):7158-7169.
  • Schindler A and Bartels A (2017). Connectivity Reveals Sources of Predictive Coding Signals in Early Visual Cortex during Processing of Visual Optic Flow. Cerebral Cortex 27(5):2885-2893.
  • Kwon S, Watanabe M, Fischer E and Bartels A (2018). Attention reorganizes connectivity across networks in a frequency specific manner. Neuroimage 144:217-226.
  • Bartels A (2014). Visual Perception: Early Visual Cortex Fills In The Gaps (mini-review/dispatch). Current Biology 24(13):R600-R602.
  • Bannert M and Bartels A (2013). Decoding the yellow of a gray banana. Current Biology 23(22):2268-2272.
  • Watanabe M, Bartels A, Macke J, Murayama Y and Logothetis NK (2013). Temporal jitter of the BOLD signal reveals a reliable initial dip and improved spatial resolution. Current Biology 23(21):2146-2150.
  • Zaretskaya N and Bartels A (2013). Perceptual effects of stimulating V5/MT+ during binocular rivalry are state-specific. Current Biology 23(20):R919-R920.
  • Zaretskaya N, Anstis S and Bartels A (2013). Parietal cortex mediates conscious perception of illusory gestalt. Journal of Neuroscience 33(2):523-531.
  • Schindler A and Bartels A (2013). Parietal cortex codes for egocentric space beyond the field of view. Current Biology 23(2):177-182.
  • Fischer E, Bülthoff HH, Logothetis NK and Bartels A (2012). Human Areas V3A and V6 Compensate for Self-Induced Planar Visual Motion. Neuron 73(6):1228-1240.

Group Leader and Further Information

Andreas Bartels
Vision and Cognition
Werner Reichardt Centre for Integrative Neuroscience
Otfried-Mueller-Str. 25
72076 Tübingen
Germany

Phone: +49 (0)7071 29-89168