Colin Blakemore (School of Advanced Study, University of London)
‘What has science ever done for perception?’
Sensory physiology and psychology have made important contributions to our understanding of perception, but there remain deep unsolved questions. I shall start with two assumptions: 1) the high metabolic cost of impulses puts value on elimination of redundancy and on sparse coding; 2) genetic mechanisms and experience-dependent plasticity both contribute to the creation of stimulus selectivity in sensory neurons. These principles are essentially the same as two of the dogmas of Barlow’s (1972) classic neuron doctrine for perceptual psychology. However, I shall question two other of Barlow’s dogmas: that impulse rate indicates only the certainty that the preferred feature is present; and that the content of perception is not dependent on combinatorial rules of usage of nerve cells.
The encoding properties of low-level sensory neurons are essentially ambiguous, in the sense that variation of the stimulus along many dimensions leads to variation in the probability of impulses. Hence there is no feature whose presence could be unambiguously signalled by impulses in such a neuron. It follows that disambiguation involves comparison of signals in several neurons, either explicitly, through convergence on to a shared target neuron, or by some other mechanism for comparing the activity of different neurons. I shall argue that one of the functions of non-primary sensory areas in the cortex of higher mammals is to make explicit the spatial and temporal relationships between the activity of separate neurons in the primary area. An interesting question is whether all the discriminable aspects of a sensory experience correspond to such explicit encoding.
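The point can be made concrete with a minimal sketch. In the toy model below (my own illustrative assumption, not a model from the talk), a V1-like neuron's firing rate is the product of a Gaussian orientation tuning curve and a linear contrast gain, so a single rate confounds orientation with contrast; but the ratio of the rates of two neurons with different preferred orientations cancels the shared contrast term, illustrating how comparison across neurons can disambiguate what no single neuron can signal unambiguously.

```python
import math

# Toy model (illustrative assumption): firing rate is the product of a
# Gaussian orientation tuning curve and a linear contrast gain, so one
# rate entangles two stimulus dimensions.
def rate(orientation_deg, contrast, preferred_deg, bandwidth_deg=30.0, gain=50.0):
    tuning = math.exp(-((orientation_deg - preferred_deg) ** 2)
                      / (2.0 * bandwidth_deg ** 2))
    return gain * contrast * tuning

# The same rate can arise from a preferred orientation at half contrast
# and from an off-preferred orientation at full contrast:
r_preferred_dim = rate(0.0, 0.5, preferred_deg=0.0)
r_off_bright = rate(35.32, 1.0, preferred_deg=0.0)

# Comparing two neurons with different preferred orientations cancels
# the shared contrast factor, so the ratio depends on orientation alone:
def orientation_cue(orientation_deg, contrast):
    return (rate(orientation_deg, contrast, preferred_deg=0.0)
            / rate(orientation_deg, contrast, preferred_deg=45.0))

print(round(r_preferred_dim, 2), round(r_off_bright, 2))  # nearly equal rates
print(orientation_cue(20.0, 0.3), orientation_cue(20.0, 0.9))  # equal cues
```

The cancellation is exact here only because contrast enters as a common multiplicative factor; the sketch stands in for any mechanism that reads out relations between neurons rather than absolute rates.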
Sensory receptors, and the central neurons that receive signals from them, have evolved to provide organisms with information that is useful in guiding their behaviour. ‘Information’ must be defined not only mathematically, but in terms of behavioural significance. Any biological interpretation of sensory processing would, then, emphasise the value of its conclusions: reverse-engineered, inductive inferences about the nature of things and events in the outside world (and within the animal). In those terms it is not at all surprising that perception is multisensory: integration of information from different senses can provide more reliable evidence about the location and nature of events. What is surprising is that we are aware of the raw, unimodal elements of experience (Block’s phenomenal consciousness; qualia), which don’t seem to matter in terms of behaviour. If I spot a ripe strawberry and pick it, what matters is the knowledge that it is a strawberry and that it is ripe. Activation of red-catching cones in my retina helps me with that identification, but why do I need to experience the redness directly?
What is even more remarkable, given the evident mingling of sensory information, is how robust the phenomenal awareness of the modality of experiences seems to be. Recently, Yuanyuan Zhao and I have been testing the apparent robustness of the modal ‘tagging’ of sensory experience. In detection tasks in which the stimulus is either a flash of light or a sound beep, presented in random sequence, there is, surprisingly, no difference in threshold or in certainty of awareness for either modality, compared with control experiments using stimuli of only one modality. Even more surprisingly, the modality can be identified more reliably than the stimulus can be detected. Observers are remarkably reliable at identifying the modality of a stimulus, even when they are unsure that they have had a sensory experience.
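A simple signal-detection sketch shows how such a result is at least possible (a toy model of my own, with assumed parameters, not the analysis used in these experiments): if detection is an absolute judgement of noisy channel activity against a criterion, while modality identification is a relative judgement between two channels, identification can succeed on trials where detection fails.

```python
import random

random.seed(0)

D_PRIME = 1.0     # assumed weak signal strength (toy value)
CRITERION = 1.0   # assumed detection criterion (toy value)
N_TRIALS = 20000

detect_hits = 0
identify_correct = 0
for _ in range(N_TRIALS):
    signal_is_visual = random.random() < 0.5
    # Unit-variance Gaussian noise in each channel; the signalled
    # channel gets a mean shift of D_PRIME.
    x_visual = random.gauss(D_PRIME if signal_is_visual else 0.0, 1.0)
    x_audio = random.gauss(0.0 if signal_is_visual else D_PRIME, 1.0)
    # Detection: absolute judgement against a fixed criterion.
    if max(x_visual, x_audio) > CRITERION:
        detect_hits += 1
    # Identification: relative judgement between the two channels.
    if (x_visual > x_audio) == signal_is_visual:
        identify_correct += 1

print("detection hit rate:", detect_hits / N_TRIALS)
print("identification accuracy:", identify_correct / N_TRIALS)
```

Under these assumed parameters the comparison between channels is correct more often than the criterion is exceeded, so identification outperforms detection; whether this is the right model of the observers' judgements is, of course, an empirical question.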
From a functional perspective, why should I be so distinctly aware that the shape, colour and location of a strawberry are visual experiences, while its odour is an olfactory experience, when both help me to identify it as a strawberry?
Even though different components of a visual experience (shape, colour, motion, depth, etc.) seem to depend on explicit representation in different regions of extrastriate visual cortex, they all share the same modal sense of being visual. Since virtually all visual signals arrive at the primary visual cortex (V1) and are then distributed to extrastriate regions, is it possible that the awareness of modality ‘arises’ in V1, even if the content of experience depends on explicit representation in the activity of neurons outside V1?
- Barlow, H. B. (1972), ‘Single units and sensation: a neuron doctrine for perceptual psychology’, Perception 1: 371–94
- Pennartz, C. M. A. (2009), ‘Identification and integration of sensory modalities: neuronal basis and relation to consciousness’, Consciousness & Cognition 18: 718–93