Prof. Shinsuke Shimojo

Prof. Shimojo's ISP2012 Lecture Course Abstract:

Crossmodal Interactions – Attention and synchrony

Crossmodal interactions are important because they provide the basis for adaptive behavior in the daily environment. More theoretically, they are critical for a better understanding of the interplay between top-down vs. bottom-up, conscious vs. subconscious, and sensory vs. motor processes. In this lecture, I will raise over a dozen big questions about crossmodal interactions (as listed below), to which I can find at least partial empirical answers in work from my own and other laboratories. These will provide a balanced overview of this quickly expanding field on the one hand, and locate it in broader biological (i.e. Nature vs. Nurture) as well as computational (i.e. the Modularity notion) contexts on the other.

<14 Big Questions (& partial answers) about Crossmodal Interactions>
Q1. Are sensory modalities segregated both anatomically and functionally?
A: Not completely. There are more vigorous interactions at earlier levels than previously believed.
Q2. Are their relationships (connections) flexible?
A: Very flexible, as indicated by early sensory plasticity in animals, as well as by sensory substitution studies in humans.
Q3. Is there a single clock across sensory modalities, or rather multiple clocks?
A: There is evidence for both, depending on which psychophysical paradigm is used.
Q4. Does vision affect other modalities?
A: Of course yes, with the McGurk and the ventriloquism effects as classical examples, which have led to the common notion that the human is a vision-dominant species.
Q5. Can other modalities affect vision? For example, can audition affect vision?
A: According to the latest findings, yes. This necessitates some theoretical modification of the above-mentioned notion that the human is vision-dominant.
Q6. Can audition affect vision not only quantitatively, but also qualitatively (i.e. in the structure of the percept)?
A: Yes, the "double flash" illusion would be the strongest evidence.
Q7. Is this auditory "capture" of the visual percept really due to early sensory pathways (as opposed to cognitive or selection bias)?
A: Yes, according to various psychophysical control experiments and fMRI/EEG evidence.
Q8. Can the role of one modality in a crossmodal interaction be replaced by another modality?
A: This may sound like an odd question, but the answer is mostly yes, and it points to the generality of crossmodal integration mechanisms.
Q9. Is crossmodal synchrony/temporal order modifiable?
A: Yes, via sustained adaptation/aftereffect.
Q10. Are transient signals critical for crossmodal integration (and thus what is common to both cross- and within-modal cases)?
A: Seemingly yes; this is related to Q8 above.
Q11. What determines how ambiguity is resolved in crossmodal perception?
A: Attention and timing (synchrony) are critical.
Q12. Can crossmodal adaptation/aftereffect occur without contingent exposure?
A: Yes, according to the crossmodal temporal rate adaptation findings.
Q13. Computationally, can the Bayesian (maximum likelihood) model explain everything?
A: Mostly, but not quite. The above-mentioned "double flash" illusion (Q6) and the non-contingent adaptation (Q12) may be notable exceptions (a minimal sketch of the model is given after this list).
Q14. Does crossmodal integration provide a perceptual basis for perceptual metaphor (and thus language)?
A: Yes, there is accumulating evidence for intrinsic crossmodal mapping, which in turn provides a basis for spatial metaphor (such as "up" and "down" in both the spatial and tonal domains).
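
For readers unfamiliar with the model mentioned in Q13, here is a minimal sketch of the standard maximum-likelihood (reliability-weighted) formulation of cue combination; this is the textbook form of the model applied to, e.g., visual and auditory estimates of a common source, not a formulation specific to this lecture. Assuming unbiased, independent Gaussian estimates $\hat{s}_V$ and $\hat{s}_A$ with variances $\sigma_V^2$ and $\sigma_A^2$, the optimal combined estimate is

$$
\hat{s}_{VA} = w_V\,\hat{s}_V + w_A\,\hat{s}_A,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2},
\quad
w_A = \frac{1/\sigma_A^2}{1/\sigma_V^2 + 1/\sigma_A^2},
$$

$$
\sigma_{VA}^2 = \frac{\sigma_V^2\,\sigma_A^2}{\sigma_V^2 + \sigma_A^2} \le \min\!\left(\sigma_V^2,\ \sigma_A^2\right).
$$

The more reliable (lower-variance) modality receives the larger weight, which accommodates both vision dominance (Q4) and auditory influence on vision when visual reliability is degraded (Q5 and Q6); as noted in Q13, phenomena such as the "double flash" illusion and non-contingent adaptation are not fully captured by this formulation.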