
1 Bayesian integration of visual and auditory signals for spatial localization
Authors: Peter W. Battaglia, Robert A. Jacobs, and Richard N. Aslin. COGS 272, Spring 2010. Instructor: Prof. Angela Yu. Presenter: Vikram Gupta

2 Outline Introduction Background Methods Procedure Results Discussion

3 Introduction: Spatial localization is complex
Localization integrates multiple sensory and motor signals. Sensory: binaural time, phase, and intensity differences. Motor: orientation of the head.

4 Introduction: Inconsistent spatial cues
Typically we receive consistent spatial cues. What happens when we do not (e.g., a movie theater or television, where the sound does not come from the on-screen source)? Visual capture: vision dominates over a conflicting auditory cue (e.g., recalibration in the juvenile owl). Is this optimal?

5 Background: Models for inconsistent cue integration
Winner-take-all (e.g., visual capture): the dominant signal alone decides. Alternative: blend information from the sensory sources. Is blending statistically optimal? Example: the maximum-likelihood estimate (MLE), which assumes independent, normally distributed sensory signals.

6 Background MLE Example
Impact of reliability on MLE estimate
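To make the reliability weighting concrete, here is a minimal sketch (my own illustration, not code from the paper) of MLE cue combination under the stated assumptions of independent, normally distributed visual and auditory signals; the function name mle_combine and the example numbers are hypothetical:

import numpy as np

def mle_combine(L_v, sigma_v, L_a, sigma_a):
    # Weights are proportional to each cue's reliability (inverse variance),
    # so the less noisy cue dominates the combined estimate.
    r_v, r_a = 1.0 / sigma_v**2, 1.0 / sigma_a**2
    w_v, w_a = r_v / (r_v + r_a), r_a / (r_v + r_a)
    L_star = w_v * L_v + w_a * L_a                 # combined location estimate
    sigma_star = np.sqrt(1.0 / (r_v + r_a))        # never larger than the smaller input std
    return L_star, sigma_star

# Reliable vision (sigma_v = 0.5 deg) pulls the estimate toward the visual cue;
# making vision noisier (sigma_v = 3.0 deg) shifts weight toward audition.
print(mle_combine(L_v=0.75, sigma_v=0.5, L_a=-0.75, sigma_a=1.5))
print(mle_combine(L_v=0.75, sigma_v=3.0, L_a=-0.75, sigma_a=1.5))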

7 MLE Model: Is a normal distribution a good model of the neural coding of sensory input? Does this integration always occur, or are there qualifying conditions? Does it make sense to integrate if Lv* and La* are far apart, or if the visual and auditory signals are temporally separated?

8 Schematic of MLE Integration
Ernst, 2006 (MLE integration for haptic and visual input).

9 Experiment: Does visual capture or MLE better match the empirical data? Method summary:
Noise is produced at 1 of 7 locations spaced 1.5° apart. The visual stimulus is corrupted by noise at 5 levels: 10%, 23%, 36%, 49%, 62%. Single-modality trials (audio alone / noisy visual alone) → MLE parameters → predicted performance for audio + noisy visual → compare with the empirical data.

10 Experiment: single-modality and bimodal trials
Single-modality: a standard stimulus (S) is followed by a comparison (C); the subject reports whether C is to the left or right of S. Bimodal: the standard's audio and visual components are displaced apart from the center, while the comparison's audio and visual components are co-located. Only 1 subject was aware of the spatial discrepancy in S.

11 Results (1 subject): cumulative normal distributions are fit to the data
The fitted mean and variance are used in the MLE model. wv receives a high value when visual noise is low; wa receives a high value when visual noise is high.

12 Results (MLE estimate of sensory input)
rt = 1 if the comparison is judged to be to the right of the standard (0 otherwise); pt = the probability of rt given the mean and variance; R = the set of responses over the T independent trials. Assuming a normal distribution, the MLE estimates of the mean and variance parameters are:
µml = (1/T) ∑t rt
σ²ml = (1/T) ∑t (rt − µml)²

13 L* based on MLE estimates
The mean is calculated according to the weighted average above; the variance of the combined estimate is smaller than that of either P(L|v) or P(L|a), as restated below.
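For reference, the standard MLE combination rule behind this slide (restated here, not quoted verbatim from the slides) makes the variance reduction explicit:

w_v = (1/σv²) / (1/σv² + 1/σa²),  w_a = (1/σa²) / (1/σv² + 1/σa²)
L* = w_v · Lv* + w_a · La*
σ*² = σv² σa² / (σv² + σa²) ≤ min(σv², σa²)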

14 L* based on MLE estimates
The MLE estimates for wv and wa are found by maximizing the RHS of Eq. (3) and using Eq. (6); τ is a scale (slope) parameter.

15 Results (bimodal; same subject, then all subjects)
In the standard stimulus the visual and audio components are 1.5° apart. The point of subjective equality lies close to the visual component for low visual noise and near 0.1° for high noise: visual input dominates at low noise, while the cues receive roughly equal weight at high noise.

16 Empirical vs. MLE: The MLE-predicted visual weights are significantly lower than the empirically measured weights, i.e., subjects rely on vision more than the MLE model predicts. A Bayesian model with a prior that reduces the variance estimated from visual-only trials provides a good fit to the data.

17 Bayesian (MAP) Cue Integration
For visual-only trials, instead of using the MLE for the mean and variance, the likelihood (the RHS above) is multiplied by a prior over the parameters of the normal distribution: the mean is assumed to have a uniform distribution, and the variance an inverse-gamma distribution with parameters biased toward small variance.
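A minimal sketch of this MAP idea, assuming a flat prior on the mean and an inverse-gamma(α, β) prior on the variance chosen to favor small variances; the parameter values and the function name map_fit_psychometric are illustrative, not the paper's actual prior:

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, invgamma

def map_fit_psychometric(offsets, responses, alpha=3.0, beta=0.5):
    # Same likelihood as the MLE fit above, plus a log-prior on sigma^2.
    # A uniform prior on the mean only adds a constant, so it is omitted.
    def neg_log_posterior(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        p = np.clip(norm.cdf(offsets, loc=mu, scale=sigma), 1e-9, 1 - 1e-9)
        log_lik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
        log_prior = invgamma.logpdf(sigma**2, a=alpha, scale=beta)  # biased toward small variance
        return -(log_lik + log_prior)
    fit = minimize(neg_log_posterior, x0=[0.0, 0.0])
    return fit.x[0], np.exp(fit.x[1])    # MAP mean and sigma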

18 Discussion: The Bayesian approach is a hybrid of the MLE and visual-capture models. How are variances encoded? How are priors encoded? How does temporal separation of the cues affect sensory integration? What is the biological basis for Bayesian cue integration?

