Article
Detecting correlations between auditory and visual signals
Journal of Vision (2007)
  • Pei-Yi Ko
  • Carmel Levitan, Occidental College
  • Martin Banks
Abstract
In combining information from multiple sources, the brain must determine which signals correspond. For instance, in a crowded room, there may be many people speaking at once, but the brain correctly determines which speaker's lip movements match which sound. To examine the ability to detect correlations between auditory and visual stimuli, we presented auditory-visual stimulus pairs that contained correlated and uncorrelated changes over time. The visual stimuli were modulated in size and the auditory stimuli were modulated in intensity. We used a two-interval, forced-choice procedure to measure correlation-detection thresholds. In the signal interval, the amplitude modulations contained a correlated component. In the no-signal interval, the modulations were uncorrelated. Observers indicated which of two intervals on each trial contained the correlated modulations. To find the correlation-detection threshold, we varied the proportion of correlated and uncorrelated modulation in the signal interval. In one experiment, we varied the temporal frequency of the amplitude modulation by band-pass filtering the modulation waveforms. Correlation detection was good (threshold was ∼0.2) for temporal frequencies of 0.5–2 Hz and then deteriorated at progressively higher frequencies. This suggests that the mechanisms involved in detecting auditory-visual correlations are sluggish. In another experiment, we presented broad-band stimuli and varied the temporal lag between the auditory and visual stimuli. Correlation-detection threshold was roughly constant for lags of ±200 msec and was elevated about two-fold for lags of ±400 msec. There was no obvious asymmetry in this lag effect. Thus, the mechanisms involved in detecting auditory-visual correlations tolerate fairly substantial time offsets. In analogy to models of stereo correspondence, we developed an auditory-visual cross-correlator and found that its properties are similar to those observed experimentally.
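The stimulus construction described in the abstract lends itself to a simple simulation. Below is a minimal sketch, not the authors' code, of how a stimulus pair for one interval might be generated: a shared band-pass-filtered modulation is mixed with independent noise in proportion p, and a Pearson correlation serves as a crude stand-in for the cross-correlator model. The sampling rate, the 0.5–2 Hz band, the mixing rule, and all function names are illustrative assumptions.

```python
# A minimal sketch (assumed parameters, not the authors' implementation) of
# generating correlated/uncorrelated amplitude modulations and scoring their
# similarity with a simple correlation measure.

import numpy as np


def bandpass_noise(n_samples, fs, low_hz, high_hz, rng):
    """White Gaussian noise band-pass filtered in the frequency domain."""
    noise = rng.standard_normal(n_samples)
    spectrum = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    filtered = np.fft.irfft(spectrum, n_samples)
    return filtered / np.std(filtered)


def stimulus_pair(p, n_samples=2000, fs=1000.0, band=(0.5, 2.0), seed=0):
    """Visual (size) and auditory (intensity) modulations sharing a common
    component in proportion p (p = 1: identical, p = 0: independent)."""
    rng = np.random.default_rng(seed)
    common = bandpass_noise(n_samples, fs, *band, rng)
    vis_noise = bandpass_noise(n_samples, fs, *band, rng)
    aud_noise = bandpass_noise(n_samples, fs, *band, rng)
    visual = p * common + (1 - p) * vis_noise
    auditory = p * common + (1 - p) * aud_noise
    return visual, auditory


def normalized_correlation(x, y):
    """Pearson correlation: a crude stand-in for a cross-correlator model."""
    return float(np.corrcoef(x, y)[0, 1])


if __name__ == "__main__":
    for p in (0.0, 0.2, 1.0):
        v, a = stimulus_pair(p)
        print(f"p = {p:.1f}  correlation = {normalized_correlation(v, a):+.2f}")
```

Sweeping p between 0 and 1 in such a simulation gives a rough sense of how the proportion of correlated modulation in the signal interval relates to the correlation an ideal detector could exploit.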
Publication Date
June 30, 2007
Citation Information
Pei-Yi Ko, Carmel Levitan, and Martin Banks. "Detecting correlations between auditory and visual signals." Journal of Vision, Vol. 7, Iss. 9 (2007).
Available at: http://works.bepress.com/carmel_levitan/14/