Bayesian Perception.

General Idea
- Perception is a statistical inference.
- The brain stores knowledge about P(I,V), where I is the set of natural images and V are the perceptual variables (color, motion, object identity).
- Given an image, the brain computes P(V|I).

General Idea Decisions are made by collapsing the posterior distribution onto a single value, for example its mode (the MAP estimate) or its mean.
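As a toy illustration of this idea (my own example with made-up numbers, not from the slides), a discretized posterior P(V|I) can be formed from a prior and a likelihood and then collapsed to either its mode or its mean:

```python
# Toy illustration (made-up numbers): form P(V|I) over a discrete set of
# perceptual values and collapse it to a single decision.
import numpy as np

v = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])           # candidate values of V
prior = np.array([0.1, 0.2, 0.4, 0.2, 0.1])         # P(V)
likelihood = np.array([0.05, 0.1, 0.2, 0.4, 0.25])  # P(I|V) for the observed image

posterior = prior * likelihood
posterior /= posterior.sum()                         # P(V|I)

v_map = v[np.argmax(posterior)]                      # collapse: posterior mode (MAP)
v_mean = np.sum(v * posterior)                       # collapse: posterior mean
```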

Key Ideas
- The nervous system represents probability distributions, i.e., it represents the uncertainty inherent in all stimuli.
- The nervous system stores generative models, or forward models, of the world (e.g. P(I|V)).
- Biological neural networks can perform complex statistical inferences.

A simple problem Estimating direction of motion from a noisy population code

Population Code [figure: tuning curves and the resulting pattern of activity A across the population]

Maximum Likelihood

Maximum Likelihood
The maximum likelihood estimate is the value of θ maximizing the likelihood P(A|θ), i.e. the noise distribution of the activity pattern A viewed as a function of θ. Therefore, we seek the estimate θ̂_ML = argmax_θ P(A|θ). The ML estimate is unbiased and efficient (it attains the Cramér-Rao bound).
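To make the decoding step concrete, here is a minimal sketch (not taken from the presentation) of maximum-likelihood decoding of direction from a population of bell-shaped tuning curves with Poisson noise; the neuron count, tuning width, and gain below are arbitrary choices:

```python
# Minimal sketch: ML decoding of motion direction from a noisy population code
# with bell-shaped tuning curves and Poisson noise. Parameter values are illustrative.
import numpy as np

n_neurons = 64
preferred = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)  # preferred directions

def tuning(theta, width=0.5, gain=20.0):
    """Bell-shaped (circular Gaussian-like) tuning curves f_i(theta)."""
    return gain * np.exp((np.cos(theta - preferred) - 1.0) / width**2)

rng = np.random.default_rng(0)
true_theta = np.deg2rad(90.0)
A = rng.poisson(tuning(true_theta))          # noisy pattern of activity A

# Poisson log-likelihood log P(A|theta) on a grid of candidate directions;
# the log(A_i!) term is constant in theta and dropped.
grid = np.linspace(0, 2 * np.pi, 720, endpoint=False)
f = tuning(grid[:, None])                    # shape (720, n_neurons)
log_like = (A * np.log(f) - f).sum(axis=1)

theta_ml = grid[np.argmax(log_like)]
print(np.rad2deg(theta_ml))                  # close to 90 deg
```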

MT / V1 [figures: population activity plotted against preferred direction in the V1 and MT layers of the network]

Linear Networks Networks in which the activity at time t+1 is a linear function of the activity at the previous time step.

Linear Networks Equivalent to population vector
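For reference, a small sketch (illustrative values only) of the population-vector readout that such a linear network effectively computes:

```python
# Population-vector readout: sum each unit's activity times a unit vector along
# its preferred direction, then take the angle of the resulting vector.
import numpy as np

def population_vector(A, preferred):
    x = np.sum(A * np.cos(preferred))
    y = np.sum(A * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

# Example with a made-up noisy hill of activity centered near 90 deg
preferred = np.linspace(0, 2 * np.pi, 64, endpoint=False)
A = np.exp((np.cos(preferred - np.pi / 2) - 1.0) / 0.5) + 0.1 * np.random.rand(64)
print(np.rad2deg(population_vector(A, preferred)))   # roughly 90
```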

Nonlinear Networks Networks in which the activity at time t+1 is a nonlinear function of the activity at the previous time step.
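A rough sketch of the kind of recurrent dynamics meant here, in the spirit of the Deneve, Latham and Pouget line-attractor network (quadratic activation followed by divisive normalization); the weights and constants below are my own assumptions, not the published values:

```python
# Sketch of nonlinear recurrent dynamics: bell-shaped lateral weights plus
# divisive normalization. A noisy input hill relaxes to a smooth hill whose
# position approximates the ML estimate. Parameter values are illustrative.
import numpy as np

n = 64
pref = np.linspace(0, 2 * np.pi, n, endpoint=False)
# Translation-invariant, bell-shaped recurrent weights W_ij = w(pref_i - pref_j)
W = np.exp((np.cos(pref[:, None] - pref[None, :]) - 1.0) / 0.3)

def relax(A, n_steps=20, mu=0.01):
    """Iterate a_{t+1} = (W a_t)^2 / (mu + sum (W a_t)^2)."""
    a = A.astype(float)
    for _ in range(n_steps):
        u = W @ a
        a = u**2 / (mu + np.sum(u**2))
    return a

# The peak of the relaxed hill can then be read out, e.g. with a population vector.
```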

MT / V1 [figure: population activity plotted against preferred direction in the V1 and MT layers]

Maximum Likelihood

Standard Deviation of the Estimate [figure-only slides]

Weight Pattern [figure: weight amplitude as a function of the difference in preferred direction]

Performance Over Time

General Result Networks of nonlinear units with bell-shaped tuning curves and a line attractor (stable smooth hills) are equivalent to a maximum likelihood estimator, regardless of the exact form of the nonlinear activation function.

General Result
Pros:
- Maximum likelihood estimation
- Biological implementation (the attractor dynamics are akin to a generative model)
Cons:
- No explicit representation of probability distributions
- No use of priors

Motion Perception

The Aperture Problem The aperture in itself introduces uncertainty

The Aperture Problem [sequence of figure-only slides; the later frames are plots in velocity space, axes: horizontal velocity (deg/s) vs. vertical velocity (deg/s)]

Standard Models of Motion Perception
- IOC: intersection of constraints
- VA: vector average
- Feature tracking
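As a small worked example (my own numbers): for a plaid made of two gratings, each grating contributes a constraint line in velocity space, and the IOC and VA solutions can be computed as follows:

```python
# Worked example (illustrative numbers): IOC vs. VA for a plaid of two gratings.
# Each grating's normal velocity defines a constraint line in velocity space:
# all v with v . n_i = s_i (n_i unit normal, s_i normal speed, deg/s).
import numpy as np

n1, s1 = np.array([np.cos(np.deg2rad(60)), np.sin(np.deg2rad(60))]), 2.0
n2, s2 = np.array([np.cos(np.deg2rad(120)), np.sin(np.deg2rad(120))]), 2.0

# IOC: intersection of the two constraint lines, i.e. solve [n1; n2] v = [s1; s2]
v_ioc = np.linalg.solve(np.stack([n1, n2]), np.array([s1, s2]))

# VA: average of the two normal-velocity vectors s_i * n_i
v_va = 0.5 * (s1 * n1 + s2 * n2)

print("IOC:", v_ioc)   # the pattern velocity of a rigid plaid
print("VA :", v_va)    # generally slower, biased toward the component normals
```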

Standard Models of Motion Perception [figures: IOC and VA predictions plotted in velocity space, horizontal vs. vertical velocity (deg/s)]

Standard Models of Motion Perception Problem: perceived motion is close to either the IOC or the VA prediction, depending on stimulus duration, eccentricity, contrast, and other factors.

Standard Models of Motion Perception Example: a moving rhombus [figures: "Percept: IOC" and "Percept: VA" cases, with the IOC and VA constructions shown in velocity space (horizontal vs. vertical velocity, deg/s)]

Bayesian Model of Motion Perception Perceived motion corresponds to the MAP estimate

Prior Human observers favor slow motions (examples: rotating wheel, switching dot patterns). [figure: prior over velocity, plotted against horizontal and vertical velocity]

Likelihood (Weiss and Adelson) [figure-only slides: likelihood plotted in velocity space, horizontal vs. vertical velocity]

Bayesian Model of Motion Perception Perceived motion corresponds to the MAP estimate

Motion through an Aperture Humans perceive the slowest motion

Motion through an Aperture [figure: likelihood, prior, and posterior in velocity space (±50 deg/s, horizontal vs. vertical velocity), with the ML and MAP estimates marked]
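Here is a compact sketch of that computation under assumed Gaussian forms for the likelihood and the slow-motion prior (function and parameter names such as map_velocity and sigma_like are mine, not from the slides): the MAP estimate lies on the aperture constraint line but is pulled toward zero velocity, and more so when the likelihood is broad.

```python
# Sketch (assumed Gaussian likelihood and prior): MAP velocity estimate for a
# single moving edge seen through an aperture. The edge constrains only the
# velocity component along its normal n:
#   likelihood  P(I|v) ~ exp(-(n . v - s)^2 / (2 * sigma_like^2))
#   prior       P(v)   ~ exp(-|v|^2 / (2 * sigma_prior^2))   (slow-motion prior)
import numpy as np

def map_velocity(n, s, sigma_like, sigma_prior=8.0, lim=50.0, res=501):
    vx, vy = np.meshgrid(np.linspace(-lim, lim, res), np.linspace(-lim, lim, res))
    log_like = -((n[0] * vx + n[1] * vy - s) ** 2) / (2 * sigma_like ** 2)
    log_prior = -(vx ** 2 + vy ** 2) / (2 * sigma_prior ** 2)
    i = np.unravel_index(np.argmax(log_like + log_prior), vx.shape)
    return vx[i], vy[i]

n = np.array([1.0, 0.0])                        # edge normal; normal speed 10 deg/s
print(map_velocity(n, 10.0, sigma_like=1.0))    # narrow likelihood: close to (10, 0)
print(map_velocity(n, 10.0, sigma_like=10.0))   # broad likelihood: much slower
```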

Motion and Contrast Humans tend to underestimate velocity in low-contrast situations

Motion and Contrast High contrast [figure: likelihood, prior, and posterior in velocity space, with the ML and MAP estimates marked]

Motion and Contrast Low contrast [figure: likelihood, prior, and posterior in velocity space, with the ML and MAP estimates marked]

Motion and Contrast Driving in the fog: in low contrast situations, the prior dominates
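In the sketch above, increasing sigma_like (a stand-in for lowering contrast) broadens the likelihood, so the MAP estimate slides from roughly the veridical normal velocity toward zero; that is one way to read the "driving in the fog" observation, where the slow-motion prior dominates a weak likelihood.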

Moving Rhombus High contrast [figure: likelihood, prior, and posterior in velocity space, with the IOC and MAP estimates marked]

Moving Rhombus Low contrast [figure: likelihood, prior, and posterior in velocity space, with the IOC and MAP estimates marked]

Moving Rhombus

Moving Rhombus [figures: "Percept: IOC" and "Percept: VA" cases, with the IOC and VA constructions shown in velocity space (horizontal vs. vertical velocity, deg/s)]

Barberpole Illusion

Plaid Motion: Type I and II

Plaids and Contrast

Plaids and Time Viewing time reduces uncertainty

Ellipses Fat vs narrow ellipses

Ellipses Adding unambiguous motion

Biological Implementation Neurons might be representing probability distributions. How?

Biological Implementation Encoding model

Biological Implementation Decoding Linear decoder: deconvolution
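One way to picture the linear case, as a sketch under the simplifying assumption that the population activity is roughly the encoded distribution convolved with a single tuning-curve kernel (all values below are made up):

```python
# Sketch of linear "decoding by deconvolution": if the activity is P(v) convolved
# with a common tuning-curve kernel, P(v) can be recovered by regularized
# division in the Fourier domain.
import numpy as np

n = 128
v = np.linspace(-np.pi, np.pi, n, endpoint=False)        # circular variable
kernel = np.exp((np.cos(v) - 1.0) / 0.2)                 # tuning-curve shape
kernel /= kernel.sum()

# A made-up encoded distribution and the resulting (noiseless) activity pattern
p_true = np.exp((np.cos(v - 1.0) - 1.0) / 0.05)
p_true /= p_true.sum()
activity = np.real(np.fft.ifft(np.fft.fft(p_true) * np.fft.fft(kernel)))

# Linear decoder: Fourier-domain deconvolution with a small ridge term for stability
K = np.fft.fft(kernel)
p_hat = np.real(np.fft.ifft(np.fft.fft(activity) * np.conj(K) / (np.abs(K)**2 + 1e-6)))
p_hat = np.clip(p_hat, 0, None)
p_hat /= p_hat.sum()
```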

Biological Implementation Decoding: nonlinear. Represent P(V|W) as a discretized histogram and use EM to estimate its parameters.
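A rough sketch of that recipe under an assumed Poisson encoding model (not necessarily the exact scheme the slides have in mind; the multiplicative, Richardson-Lucy-style update below is one standard EM form, and all variable names are mine):

```python
# Fit a discretized histogram p_j ~ P(V = v_j | W) by EM, so that the predicted
# rates gain * sum_j p_j f_i(v_j) match the observed spike counts r_i.
import numpy as np

def em_decode(r, f, n_iter=200, gain=1.0):
    """r: observed counts, shape (n_neurons,)
       f: tuning curves evaluated on the value grid, shape (n_neurons, n_bins)"""
    n_bins = f.shape[1]
    p = np.full(n_bins, 1.0 / n_bins)          # start from a flat histogram
    for _ in range(n_iter):
        pred = gain * f @ p + 1e-12            # predicted rate per neuron
        # Multiplicative EM update (Richardson-Lucy form of the Poisson M-step)
        p *= (f.T @ (r / pred)) / (f.sum(axis=0) + 1e-12)
        p /= p.sum()                            # keep it a normalized histogram
    return p
```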