Work by Kandinsky

Workshop: Cognitive and Cross-Modal Effects on Vision

 

26 and 27 March 2011

 

Caird Room, Department of Philosophy, 69 Oakfield Avenue, University of Glasgow

 

Find us on a University map and on Google Maps.

 

Programme:
Saturday

9.00 - 9.30 Registration

9.30 - 11.00 Dustin Stokes (Philosophy, Toronto)

"Desire, value, and experience"

Abstract: If the cognitive penetration of perceptual experience is an actual and sufficiently frequent phenomenon, it has important consequences. Some of these consequences are familiar ones from the literature—consequences for architectures of mind, theories of epistemic justification, and rational scientific theory choice. Other consequences are less familiar: for example, there are possible consequences for experience and proper evaluation of artworks. Many interpretive strategies are invoked to resist putative cases of cognitive penetration. We can take these criticisms seriously and define the phenomenon accordingly, such that any case that meets the definition cannot be (or is less plausibly) interpreted as the critics suggest.  One can then simply ask whether there are cases that meet the definition. I will consider two sets of studies, one very old and one very recent, that plausibly satisfy the conditions of the provided definition. These cases, I argue, involve the penetration of experience by value or desire. A few alternative interpretive strategies may remain but, even granting this, these (and other) cases matter no less to the consequences agreed upon by both critics and friends of cognitive penetrability theses.

11.00 – 11.30 Coffee

11.30 - 1.00 Petra Vetter (Psychology, Glasgow)

"Decoding sound and imagery content in early visual cortex"

Abstract: Traditionally, early visual cortex has been thought to be implicated mainly in the processing of very basic visual features such as edges and contours. Here I present functional neuroimaging data showing that early visual cortex contains content-specific information from hearing and from mental imagery. Even in the absence of any visual stimulation, different types of natural sounds induce distinguishable activity patterns in early visual cortex. The same is true when sounds and their corresponding visual scenes are merely imagined, again in the absence of external sensory information. This evidence suggests that early visual cortex receives information from other sensory modalities and higher cortical areas and that it represents the differential content of this information. I will discuss this evidence within the framework of prediction.

1.00 - 3.00 Lunch - Stravaigin

3.00 – 4.30 Ophelia Deroy (Institut Jean Nicod & University Paris Est)

"Concept-dependence or object-sensitivity of perception? Coloured shapes and other cases"

Abstract: Having learnt that hearts are red and bananas yellow, we are more ready to perceive heart shapes as red and banana shapes as yellow (Delk and Fillenbaum, 1965; Hansen et al., 2006; Olkkonen et al., 2008). These effects on perception are difficult to measure and to explain. They vary from one object to another: in tested conditions, hearts and apples will look more red, but squares reflecting the same wavelength won’t. Does this “object-sensitivity” of colour perception suggest that the effect is shaped from outside, by higher cognitive processes where information about shapes and colours is stored together? I discuss these cases and other examples of what I call the “object-sensitivity” of perception. Contrary to Siegel or Macpherson, I don’t think that these cases give us conclusive reasons to say that perception is cognitively penetrable. I stress a series of difficulties for the latter interpretation and offer an alternative account in terms of “semantic effects”, which helps us understand the nature of multi-sensory integration. I show how this account differs, from a conceptual and experimental perspective, from the model of cognitive penetration, without simply going back to Pylyshyn’s model of modularity.

4.30 - 5.00 Coffee

5.00  - 6.30 Erik van der Burg (Psychology, Vrije Universiteit Amsterdam)

"Multisensory synchrony guides attention in dynamic cluttered environments"

Abstract: Visual attention is readily drawn to visual objects that stand out from the background, such as a red object among green objects. When such clear bottom-up signals are absent, top-down control may play a larger role, such that knowledge of the visual properties relevant to the task determines which object is selected. In the present study we show that a signal that is neither a low-level visual signal nor a source of top-down knowledge about the location or identity of the visual target object still affects the selection of that object. We will demonstrate that a non-spatial auditory event (a “pip”) can guide attention towards the location of a synchronized visual event that, without such an auditory signal, is very hard to find. Phenomenally, the pip makes the visual target pop out from its complex environment. We further investigated this “pip and pop” effect using different methodologies (ERPs, accuracy, RTs, temporal order judgments). Here we present the results, which suggest that synchronized audiovisual events guide attention due to early multisensory integration, and that this effect occurs even when the audiovisual events are irrelevant to the task. Moreover, we show that tactile-visual synchrony, too, leads to more efficient search. We conclude that multisensory synchrony guides attention in a stimulus-driven fashion.

Dinner - Ubiquitous Chip

Sunday

9.30 – 11.00 Wayne Wu (Center for the Neural Basis of Cognition and Philosophy, Carnegie Mellon)

"Intention (Pervasively) Cognitively Penetrates Visual Experience"

Abstract: I will argue that intentions cognitively penetrate visual experience because they explain position constancy during intention-modulated saccadic eye movements (on average, three times per second). In the bulk of the talk, I focus on the phenomenon of visual position constancy, a pervasive feature of conscious vision. While the phenomenon has attracted such luminaries as Descartes and von Helmholtz, and recent primate electrophysiology has elucidated relevant facets of primate visual spatial representation, giving new impetus to work on the problem, we still lack a satisfying explanation. I shall explain precisely what the phenomenon involves and argue that we need to impose the notion of spatial reference frames to explain the phenomenon adequately. I then argue that, contrary to Milner and Goodale’s account, the dorsal stream is involved in certain aspects of conscious vision and thus can provide the requisite spatial information to explain constancy. I then present a new account of position constancy that draws on spatial information widely acknowledged as present in primate cortical vision, specifically in the dorsal stream. Finally, I argue that many cases of position constancy during saccades are cases where intention cognitively penetrates conscious vision, indeed pervasively so.

11.00 - 11.30 Coffee

11.30 – 1.00 Charles Spence (presenting joint work with Yi-Chuan Chen) (Crossmodal Research Laboratory, Department of Experimental Psychology, Oxford)

"The crossmodal facilitation of vision by audition: How and when?"

Abstract: In this talk, we will review the latest evidence from recent studies conducted at the Crossmodal Research Laboratory in Oxford demonstrating that a person’s ability to detect, discriminate, and identify a visual target can be enhanced by the simultaneous (or near-simultaneous) presentation of an auditory event. Evidence will be presented from studies that have utilized a variety of different psychophysical techniques, including the pip-and-pop task, binocular rivalry, and backward masking, among others. We will examine the crossmodal facilitation that results from the presentation of both meaningless stimuli (e.g., beeps and white noise bursts) and semantically meaningful stimuli, such as object sounds (e.g., the sound of a woofing dog) and words. We will describe our latest results showing that object sounds and the words used to describe those objects (e.g., the word “dog”), when presented auditorily, appear to crossmodally impact visual perception in different ways. These results will be explained with reference to the type/token account of event perception.

Lunch - The Left Bank


This workshop is jointly organised by Fiona Macpherson (Glasgow) and Athanassios Raftopoulos (Cyprus) under the auspices of the Centre for the Study of Perceptual Experience, University of Glasgow, and CenSes: the Centre for the Study of the Senses, Institute of Philosophy, School of Advanced Study, University of London.

The workshop is partly funded by the Mind Association, the Scots Philosophical Association and the British Society for the Philosophy of Science.

If you would like to attend the workshop, please e-mail Fiona Macpherson. The fee to attend is £40 (£20 for graduate students). If you would like to attend the dinner on the Saturday evening or the lunches, these will cost extra. Please register by 16 March 2011.



Description of the Topic

The topic of this workshop is cognitive and cross-modal effects on vision, particularly visual experience. Traditionally, it was thought that visual processing and visual experience were unaffected by a subject’s beliefs, desires, emotions and other perceptual experiences. Such a model was attractive because it promised that perception could provide a source of evidence about the world untainted by the beliefs or theories one holds about the world, or by one’s emotions or desires.


However, in recent years much work has been done on the nature of visual processing by vision scientists, psychologists, and neuroscientists, and several top-down effects on visual processing, such as effects on its timing and its modulation by attention, are well substantiated (Raftopoulos, 2009). Similarly, much scientific research on cross-modal effects suggests that the processing streams of the different senses interact far more than was traditionally thought. For example, the surprising McGurk illusion demonstrates that what one hears can be affected by what one sees (McGurk and MacDonald, 1976).


This research raises several substantial questions that can be addressed by both philosophers and scientists, and will be the focus of this workshop.

  1. Do these effects constitute evidence in support of the claim that “cognitive penetration” occurs (Pylyshyn, 1999)? This is the claim that the content of perceptual states and perceptual experience is determined, at least partly, by the content of our cognitive states, such as beliefs, desires and expectations, such that the change in the content of the perceptual experience is made intelligible, or in some very minimal sense rational, in light of the content of the cognitive state.
  2. Most of the scientific research has focused on the effects of cognition on vision via the attentional modulation of visual processing. But precisely whether this sort of effect amounts to cognitive penetration is a key question. It remains to be seen whether cognition could modulate perceptual processing in some other, non-attentional way. There has been a recent flurry of philosophical work on this topic, e.g. Macpherson (forthcoming), Raftopoulos (2009), Stokes (forthcoming).
  3. It is known, of course, that emotions can affect perception, but until recently it was thought that emotional effects are mediated only through attention (for example, emotions capture attention or make us inattentive). However, recent work by Vuilleumier and Driver (2007) provides evidence for emotional effects on visual processing that are not mediated by attention. Does this provide us with a mechanism by which cognitive factors could influence perception?
  4. The more general question of what mechanisms may be responsible for cognitive penetration will also be addressed. Is there more than one such mechanism, and what do these mechanisms tell us about the phenomenon and the types of interaction between experience and cognitive states?
  5. A further concern is about the nature of perceptual content. If one thinks that the content of perception is conceptual does that commit one to the existence of cognitive penetration or vice versa?
  6. The final questions concern cross-modal interactions. If one has an experience that is caused in part by the interaction of two senses, in what circumstances should we say that the experience is in one or other of the senses, or in both, or in neither? And are there any limits to the extent of cross-modal interaction?


These questions are important not only for the goal of understanding the nature of the mind but also because which answers we provide may have substantial moral and epistemological consequences. The epistemic issues can be seen by considering the following: if your experience tells you that an object is present, but, unbeknownst to you, your experience is affected by your beliefs about that object, to what extent are you justified in forming the belief that the object is there? (See Siegel (forthcoming).) The moral issues arise from the fact that people’s perceptual experiences seem skewed by cognitive factors of which they are unaware and which are beyond their control, and these experiences can affect their behaviour. For example, Plant and Peruche (2005) found that police officers’ responses to criminal suspects were affected by the suspects’ race, which may have been caused by the officers’ beliefs affecting their experience.


In this workshop we will focus on the six questions above – that is, establishing whether cognitive penetration occurs and, if so, in what circumstances and by what mechanisms, and investigating the scope and nature of cross-modal effects.

References


Chen, Y.-C., & Spence, C. (2010) "When hearing the bark helps to identify the dog: Semantically-congruent sounds modulate the identification of masked pictures", Cognition, 114, 389-404.

Spence, C., & Ngo, M.-C. (in press) "Does attention or multisensory integration explain the crossmodal facilitation of masked visual target identification?", in B. E. Stein (ed.), The New Handbook of Multisensory Processing. Cambridge, MA: MIT Press.

Chen, Y.-C., & Spence, C. (submitted) "The crossmodal facilitation of visual object representations by sound: Evidence from the backward masking paradigm", Journal of Experimental Psychology: Human Perception and Performance.

Chen, Y.-C., & Spence, C. (submitted) "Auditory semantic priming of visual picture detection", Journal of Experimental Psychology: Human Perception and Performance.

Chen, Y.-C., & Spence, C. (submitted) "Multiple levels of modulation by naturalistic sounds and spoken words on visual picture categorization", Psychonomic Bulletin & Review.

Macpherson, F. (forthcoming) “Cognitive Penetration of Colour Experience: Rethinking the Issue in Light of an Indirect Mechanism”, Philosophy and Phenomenological Research.

McGurk, H., & MacDonald, J. (1976) "Hearing lips and seeing voices", Nature, 264: 746–748.

Plant, E. A., & Peruche, B. M. (2005) "The consequences of race for police officers' responses to criminal suspects", Psychological Science, 16(3): 180–183.

Pylyshyn, Z. W. (1999) “Is Vision Continuous with Cognition? The Case for Cognitive Impenetrability of Visual Perception,” Behavioral and Brain Sciences, 22: 341-423.

Raftopoulos, A. (2005) “Cognitive Penetrability of Perception: A New Perspective”, in A. Raftopoulos (ed.) Cognitive Penetrability of Perception: Attention, Action, Strategies, and Bottom-Up Constraints, Hauppauge, NY: Nova Science.

Raftopoulos, A. (2009) Cognition and Perception: How do Psychology and the Neural Sciences Inform Philosophy. Cambridge, MA: MIT Press.

Siegel, S. (forthcoming) "Cognitive Penetrability and Perceptual Justification", Noûs.

Stokes, D. (forthcoming) “Perceiving and Desiring: A New Look at the Cognitive Penetrability of Experience”, Philosophical Studies.

Vuilleumier, P., & Driver, J. (2007) "Modulation of visual processing by attention and emotion: windows on causal interactions between human brain regions", Philosophical Transactions of the Royal Society B: Biological Sciences, 362(1481): 837–855.