
Prof. Dr. Michael J. Black

Organization: Max Planck Institute for Intelligent Systems


Spemannstr. 41
72076 Tübingen

Phone number: +49 (0)7071 601 1801

Department: Department of Perceiving Systems

Area: CIN Members

Scientific topic: Neural Coding of Natural Stimuli and Behaviour

Field of Research

Our work in computational neuroscience is grounded in the hypothesis that to understand the nervous system, we must observe and model it in natural contexts. This contrasts with much of the history of neuroscience, in which simple stimuli have been used to isolate and model properties of the system. For example, a visual stimulus such as a sine-wave grating may vary in frequency, contrast, orientation or direction of motion. By varying these parameters one can observe how the neuron responds, resulting in a “model” of neural tuning. The question is whether such models generalize to natural scenes experienced by behaving organisms. Are such models realistic, or does the neural code vary with context? These questions can be asked throughout the nervous system, and we consider both visual perception and motor behaviour. In both cases we collect naturalistic data, design experiments to explore how the neural system behaves, and develop new computational models that capture this behaviour.
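The idea of a parametric tuning model can be made concrete with a small sketch. Below is a classic cosine tuning curve; this is a textbook illustration, not a model from our work, and the baseline rate, gain, and preferred direction are invented numbers:

```python
import math

def cosine_tuning(theta, r0=10.0, gain=8.0, theta_pref=math.pi / 4):
    """Cosine tuning: firing rate (spikes/s) peaks at the preferred direction.

    r0, gain, and theta_pref are illustrative parameters one would fit
    by varying the stimulus and recording the neuron's response.
    """
    return r0 + gain * math.cos(theta - theta_pref)

# Rate is maximal at the preferred direction and minimal opposite it.
peak = cosine_tuning(math.pi / 4)              # 18.0 spikes/s
trough = cosine_tuning(math.pi / 4 + math.pi)  # 2.0 spikes/s
```

Whether a curve fitted this way from gratings still predicts responses to natural scenes is precisely the generalization question raised above.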

The Neural Control of Natural Movement

Much of what we know about motor control comes from animal models in environments that constrain behaviour to simple, limited movements. These experiments show that simple reaching motions can be decoded from small populations of spiking neurons. It is unclear whether these findings hold for more complex, full-body behaviours in unconstrained settings. To answer this question we need two things: 1) the ability to record from neurons in the brain of a freely moving animal and 2) the ability to record the behaviour of the animal (e.g. the kinematics of its limbs) in a way that leaves it unconstrained. There is a third important requirement: we must be able to mathematically analyze and model the relationships between high-dimensional behavioural signals and high-dimensional neural signals in situations where the behaviour is not repeated many times.

To address the first problem, that of neural recording, we collaborate with the group of Krishna Shenoy at Stanford University, which has developed a wireless, broadband, multi-channel technology that can record from dozens of individual neurons in the motor cortex of a monkey. The data is transmitted from the animal wirelessly, allowing it the freedom of natural movement. The second problem involves capturing the animal's behaviour. To do so, we have used marker-based motion capture technology in our work with the Donoghue lab at Brown University. With this we can capture the full kinematics of the arm and hand (shoulder, elbow, wrist and fingers) of monkeys making natural reaching movements. We have shown that we can decode such movements faithfully from as few as 30 neurons in primary motor cortex. Marker-based techniques, however, are not practical for capturing full-body animal movement, and consequently we have set up an 8-camera capture system at Stanford to record natural movements without markers. We are extending computer vision algorithms, developed to model and track humans, to track freely moving monkeys.
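To give a hedged sense of what decoding kinematics from firing rates involves, here is a minimal least-squares decoder mapping a single neuron's firing rate to a one-dimensional velocity. Real decoders operate on populations of neurons and full limb kinematics; the data below are invented for illustration:

```python
def fit_linear_decoder(rates, velocities):
    """Least-squares fit of velocity ~ a * rate + b (toy one-neuron decoder).

    rates: firing rates (spikes/s) in successive time bins.
    velocities: simultaneously measured hand velocity in each bin.
    """
    n = len(rates)
    mean_r = sum(rates) / n
    mean_v = sum(velocities) / n
    cov = sum((r - mean_r) * (v - mean_v) for r, v in zip(rates, velocities))
    var = sum((r - mean_r) ** 2 for r in rates)
    a = cov / var
    b = mean_v - a * mean_r
    return a, b

# Fabricated data: rate rises linearly with velocity.
rates = [5.0, 10.0, 15.0, 20.0]
vels = [1.0, 2.0, 3.0, 4.0]
a, b = fit_linear_decoder(rates, vels)  # a = 0.2, b = 0.0
```

Population decoding generalizes this to a matrix of regression weights over many neurons, but the principle of fitting a mapping from neural activity to kinematics is the same.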

To date we have investigated neural firing rates while a monkey performed various tasks such as walking on a treadmill, reaching for food, and sitting idly. We show that even in such an unconstrained and varied context, neural firing rates are well tuned to behaviour, supporting findings of basic neuroscience. Further, we demonstrate that the various behavioural tasks can be reliably classified, suggesting the viability of decoding techniques despite significant variation and environmental distractions associated with unconstrained behaviour.
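One simple way to make task classification from firing rates concrete is a nearest-centroid classifier over firing-rate vectors. This is a toy stand-in rather than the method used in the study, and the rates are fabricated:

```python
import math

def nearest_centroid(train, sample):
    """Classify a firing-rate vector by its nearest class centroid.

    train: dict mapping task label -> list of firing-rate vectors.
    sample: a firing-rate vector to classify.
    """
    best_label, best_dist = None, float("inf")
    for label, vectors in train.items():
        dim = len(vectors[0])
        centroid = [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]
        d = math.dist(sample, centroid)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Invented two-neuron rate vectors for two behavioural tasks.
train = {
    "walking": [[20.0, 5.0], [22.0, 6.0]],
    "reaching": [[5.0, 18.0], [6.0, 20.0]],
}
label = nearest_centroid(train, [21.0, 5.0])  # "walking"
```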

The Perception of Natural Scenes in Motion

As we move through our visual environment, the spatial and temporal pattern of light that enters our eyes arises from the illumination of the scene, the properties of objects within the environment, their motions relative to each other, and our own motion relative to the external world. Quantifying the distributed neural representation of luminance and motion in the early visual pathway is a critical step in understanding how scene information is extracted and prepared for processing in higher visual centers. In collaboration with Jose-Manuel Alonso (SUNY) and Garrett Stanley (Georgia Tech), our goal is to model how the population activity of neurons in the visual thalamus represents properties of natural scenes.

In response to natural scene movies, we show that these neurons exhibit synchrony that is modulated by the properties of the scene. The occurrence of synchronous firing of cat thalamic cells with highly overlapping receptive fields is strongly sensitive to the orientation and the direction of motion of the visual stimulus. This stimulus selectivity is robust, remaining relatively unchanged under different contrasts and temporal frequencies (stimulus velocities). Computational modeling reveals that thalamic synchrony is a good predictor of scene motion and is effective for driving the firing of cortical neurons.
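A common way to quantify synchrony between two cells (a sketch under assumptions, not necessarily the measure used in this work) is to count near-coincident spikes within a small time window:

```python
def synchrony_count(spikes_a, spikes_b, window=0.005):
    """Count spikes in train A that fall within `window` seconds of a spike in B.

    Spike times are in seconds; a 5 ms window is a typical choice for
    coincidence detection, though the value here is illustrative.
    """
    return sum(
        1 for a in spikes_a if any(abs(a - b) <= window for b in spikes_b)
    )

# Invented spike trains: only the first pair of spikes is coincident.
train_a = [0.010, 0.050, 0.120]
train_b = [0.012, 0.300]
n_sync = synchrony_count(train_a, train_b)  # 1
```

Tracking how such coincidence counts change with stimulus orientation and direction is one way a downstream model could read out scene motion from thalamic synchrony.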


To probe the visual system we have developed a naturalistic, but artificially synthesized, movie dataset that captures the statistical properties associated with natural scenes in motion while providing ground-truth knowledge of important scene properties such as optical flow.  The stimuli are derived from a 3D animated film, which allows us to render scenes under conditions of varying complexity. Our goal is to understand the population response of thalamic neurons to the structure and motion of the scene.
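Having ground-truth optical flow makes quantitative evaluation possible. A standard error measure in the optical flow literature is average endpoint error; below is a minimal sketch with invented flow vectors:

```python
import math

def average_endpoint_error(estimated, ground_truth):
    """Mean Euclidean distance between estimated and ground-truth flow vectors.

    Each flow field is a list of (u, v) displacement vectors, one per pixel.
    """
    errors = [math.dist(e, g) for e, g in zip(estimated, ground_truth)]
    return sum(errors) / len(errors)

# Two-pixel toy example: perfect at the first pixel, off by 1 at the second.
est = [(1.0, 0.0), (0.0, 1.0)]
gt = [(1.0, 0.0), (0.0, 0.0)]
aepe = average_endpoint_error(est, gt)  # 0.5
```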


References

Stanley, G. B., Jin, J., Wang, Y., Desbordes, G., Wang, Q., Black, M. J. and Alonso, J.-M. “Visual orientation and directional selectivity through thalamic synchrony.” J. Neuroscience, 2012, to appear.

Foster, J. D., Nuyujukian, P., Freifeld, O., Ryu, S., Black, M. J. and Shenoy, K. V. “A framework for relating neural activity to freely moving behavior.” 34th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC'12), San Diego, Aug.-Sept. 2012, to appear.

Vargas-Irwin, C. E., Shakhnarovich, G., Yadollahpour, P., Mislow, J. M. K., Black, M. J. and Donoghue, J. P. “Decoding complete reach and grasp actions from local primary motor cortex populations.” J. Neuroscience, 30(29):9659-9669, July 21, 2010.