Abstracts

Simon Prosser

Rethinking the Specious Present

A significant majority of philosophers currently thinking about temporal experience accept some version of the doctrine of the specious present, which I take to be the view that experiences have temporally extended contents. But there are surprisingly few arguments for this, and those that are most commonly put forward are flawed. I illustrate this by showing how motion experience can be explained in terms of what I call the dynamic snapshot theory, according to which experience has an instantaneous content that includes motion. However, this theory seems less able to cope with experiences of discontinuous change, where something is in one state at times up to and including t, but in a different state at all times thereafter. I propose a separate model for the experience of discontinuous change that still avoids the specious present. In the end, however, I suggest that debates over the specious present only make good sense given some Cartesian assumptions about experience that should be rejected. Having rejected these assumptions, we are left with a view according to which it makes no sense to ask what the subject experiences at a time, rather than over a period of time, but a view that is importantly different from the ‘extensional’ version of the specious present.

Jean Vroomen

Adaptation to Intersensory Conflict in Speech, Space, and Time

For most multisensory events, observers perceive unity and synchrony between the different senses (vision, audition, touch), despite naturally occurring lags in arrival and processing times and/or incongruent content in one of the information streams. A substantial amount of research has examined how the brain accomplishes this. In my talk, I will review several key findings from research in audiovisual speech, space and time on how the brain adapts to these intersensory conflicts.

Vanessa Harrar and Laurence Harris

Multisensory Synchrony in Perception and Action

Information about an event takes different amounts of time to be processed depending on which sensory system the event activates. What are the consequences of these differences in auditory, visual, and tactile neural processing times when multisensory information is combined? Visual information about events on the body reaches the brain at a time that is independent of the location of the event, whereas tactile information about such events takes different amounts of time to be processed depending on the distance between the stimulated surface and the brain. Similarly, sounds take longer to reach the observer, and thus the brain, when they originate from an event that is farther away. We’ll discuss how we measure the perception of simultaneity and present data demonstrating that, despite these variations in processing time for lights and sounds, the point of subjective simultaneity (PSS) for multisensory stimuli is usually close to true simultaneity. We’ll compare reaction times with the perception of simultaneity and propose a simultaneity constancy mechanism, whose features we’ll examine using data from a variety of adaptation experiments. We’ll explain how multisensory integration can be demonstrated by violations of the statistical summation of reaction times, and how statistical summation can therefore serve as a useful tool for comparing multisensory integration across populations and across different cognitive states. While temporal adaptation alters synchrony perception, multisensory facilitation of responses occurs only when audiovisual stimuli are physically synchronous, not merely when they appear synchronous. From this we suggest separate multisensory integration mechanisms for perception and action.
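
To make the idea of “violations of statistical summation” concrete, the following is a minimal sketch of a race-model (Miller-style) bound computed from empirical reaction-time distributions. The function names, time grid, and data layout are illustrative assumptions, not the authors’ actual analysis pipeline.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative RT distribution evaluated on a grid of times (e.g. ms)."""
    rts = np.asarray(rts)
    return np.array([(rts <= t).mean() for t in t_grid])

def race_model_violations(rt_auditory, rt_visual, rt_audiovisual, t_grid):
    """Return the time points at which the audiovisual RT distribution exceeds the
    race-model bound (the sum of the unisensory distributions, capped at 1).
    Such violations indicate integration beyond mere statistical summation."""
    bound = np.clip(ecdf(rt_auditory, t_grid) + ecdf(rt_visual, t_grid), 0.0, 1.0)
    observed = ecdf(rt_audiovisual, t_grid)
    return t_grid[observed > bound]

# Illustrative use with made-up reaction times (ms):
t_grid = np.arange(150, 601, 10)
violations = race_model_violations([320, 350, 380], [300, 340, 390], [240, 260, 300], t_grid)
```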

Christoph Hoerl

On the Idea That There Is No Sense Organ for Time

In the literature on temporal experience, it is sometimes claimed that “there is no sense organ for time”. In this talk, I want to examine what might be meant by this, and what motivation there might be for making such a claim. One possible motivation for it might come from the idea, familiar at least since Locke, that our primary source of awareness of time is introspection, rather than exteroception. After casting some doubt on this idea, I suggest that what motivates the claim is in fact a particular feature of the phenomenology of perceptual experience, the feature that also underlies the intuition that there is an intimate connection between something’s being present in perceptual experience and its being temporally present.

Lars Muckli

Visual Predictions in Different Layers of Visual Cortex

Normal brain function involves the interaction of internal processes with incoming sensory stimuli. We have designed a series of brain imaging experiments that probe internal models and feedback mechanisms in early visual cortex. Primary visual cortex (V1) is the entry stage for cortical processing of visual information. We can show that there are two counter-streams of information, concerned with (1) retinotopic visual input and (2) top-down predictions from internal models generated by the brain. Our results speak to the conceptual framework of predictive coding: internal models amplify or attenuate incoming information, and the brain acts as a prediction machine. Healthy brain function strikes a balance between the precision of predictions and the updating of predictions on the basis of prediction error. Our results draw on state-of-the-art, layer-specific ultra-high-field fMRI and other imaging techniques.
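
As a schematic illustration of the balance between prediction precision and error-driven updating described above, here is a minimal precision-weighted update step in the spirit of predictive coding. The Gaussian-precision assumption and the function name are illustrative only and do not correspond to the specific models tested in the fMRI work.

```python
def precision_weighted_update(prediction, sensory_input,
                              prior_precision, input_precision):
    """Schematic predictive-coding step: the prediction error is weighted by the
    relative precision (inverse variance) of the sensory input, so a precise
    input updates the internal model strongly, while an imprecise input is
    largely explained away by the prior prediction."""
    prediction_error = sensory_input - prediction
    gain = input_precision / (input_precision + prior_precision)
    return prediction + gain * prediction_error

# A precise prediction (high prior precision) damps the influence of a noisy
# input, while a precise input dominates the update.
print(precision_weighted_update(1.0, 2.0, prior_precision=4.0, input_precision=1.0))  # 1.2
print(precision_weighted_update(1.0, 2.0, prior_precision=1.0, input_precision=4.0))  # 1.8
```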