PSY 5018H: Math Models Hum Behavior, Prof. Paul Schrater, Spring 2005 Making Decisions: Modeling perceptual decisions.

Outline

–Graphical models for belief: example completed
–Discrete, continuous, and mixed belief distributions
–Vision as an inference problem
–Modeling perceptual beliefs: priors and likelihoods
–Modeling perceptual decisions: actions, outcomes, and utility spaces for perceptual decisions

Graphical Models: Modeling complex inferences

A directed graph over nodes A, B, and C, with arrows from A to B and from A to C. Arrows represent conditioning; each node stores a conditional probability table. This model represents the decomposition: P(A,B,C) = P(B|A) P(C|A) P(A). What would the diagram for P(B|A,C) P(A|C) P(C) look like?
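The decomposition above can be checked numerically. A minimal sketch, with invented CPT values for binary A, B, C (the slides give none):

```python
import numpy as np

# Hypothetical CPTs for binary variables A, B, C (values invented for illustration).
P_A = np.array([0.6, 0.4])            # P(A)
P_B_given_A = np.array([[0.9, 0.1],   # P(B | A=0)
                        [0.3, 0.7]])  # P(B | A=1)
P_C_given_A = np.array([[0.8, 0.2],   # P(C | A=0)
                        [0.5, 0.5]])  # P(C | A=1)

# Joint from the decomposition P(A,B,C) = P(B|A) P(C|A) P(A),
# with array axes ordered (A, B, C).
joint = P_A[:, None, None] * P_B_given_A[:, :, None] * P_C_given_A[:, None, :]

# A valid joint distribution sums to 1.
assert np.isclose(joint.sum(), 1.0)
```

Each entry of `joint` is a product of one term from each factor, e.g. `joint[0, 1, 0] = P(A=0) P(B=1|A=0) P(C=0|A=0)`.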

Sprinkler Problem

You see wet grass. Did it rain?

Sprinkler Problem, cont'd

Four variables: C = cloudy, R = rain, S = sprinkler, W = wet grass. What's the probability of rain? We need P(R|W).
Given: P(C) = [ ]; P(S|C); P(R|C); P(W|R,S).
Do the math:
1) Get the joint: P(R,W,S,C) = P(W|R,S) P(R|C) P(S|C) P(C)
2) Marginalize: P(R,W) = Σ_S Σ_C P(R,W,S,C)
3) Divide: P(R|W) = P(R,W)/P(W) = P(R,W)/Σ_R P(R,W)
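The three steps can be run directly. The CPT numbers in the transcript did not survive, so the values below are illustrative ones in the style of Pearl's classic sprinkler network, not the slide's:

```python
import numpy as np

# Assumed CPTs (illustrative, not from the slides). Index 0 = false, 1 = true.
P_C = np.array([0.5, 0.5])                # P(C)
P_S_given_C = np.array([[0.5, 0.5],       # P(S | C=0)
                        [0.9, 0.1]])      # P(S | C=1)
P_R_given_C = np.array([[0.8, 0.2],       # P(R | C=0)
                        [0.2, 0.8]])      # P(R | C=1)
P_W_given_RS = np.array([[[1.0, 0.0],     # P(W | R=0, S=0)
                          [0.1, 0.9]],    # P(W | R=0, S=1)
                         [[0.1, 0.9],     # P(W | R=1, S=0)
                          [0.01, 0.99]]]) # P(W | R=1, S=1)

# 1) Joint P(R,W,S,C) = P(W|R,S) P(R|C) P(S|C) P(C), axes ordered (C, S, R, W).
joint = np.einsum('c,cs,cr,rsw->csrw', P_C, P_S_given_C, P_R_given_C, P_W_given_RS)

# 2) Marginalize out S and C.
P_RW = joint.sum(axis=(0, 1))

# 3) Divide by P(W).
P_R_given_W = P_RW / P_RW.sum(axis=0, keepdims=True)

print(P_R_given_W[1, 1])  # P(rain | wet grass); ≈ 0.708 with these assumed CPTs
```

With these numbers, seeing wet grass raises the probability of rain well above its prior of 0.5, since rain explains the observation better than the (cloud-suppressed) sprinkler does.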

Continuous data, discrete beliefs

Continuous data, multi-class beliefs

[Figure: width plotted against x, y, z position]

The problem of perception

[Diagram: image measurements (color, motion, edges) feed scene inference (classify, identify, learn/update), which drives decisions/actions (plan action, control action)]

Vision as Statistical Inference

What moves: square or light source? (Kersten et al., 1996)
Rigid rotation or shape change? (Weiss & Adelson, 2000)

Modeling Perceptual Beliefs

Observations O come in the form of image measurements:
–Cone and rod outputs
–Luminance edges
–Texture
–Shading gradients
–Etc.
World states s include:
–Intrinsic attributes of objects (reflectance, shape, material)
–Relations between objects (position, orientation, configuration)
Belief equation for perception: p(s | O) ∝ p(O | s) p(s)

[Figure: image data O in the (x, y) image plane]

Possible world states consistent with O

Many 3D shapes can map to the same 2D image. [Figure: world states s in (x, y, z) space projecting to the same image]

Forward models for perception: Built-in knowledge of image formation

Images are produced by physical processes that can be modeled via a rendering equation: I = f(scene).
Modeling rendering probabilistically gives the likelihood p(I | scene); e.g., with no rendering noise, p(I | scene) = δ(I − f(scene)).
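With rendering noise, the delta function relaxes to a distribution around the rendered image. A minimal sketch with additive Gaussian pixel noise; the toy renderer `f` and noise level `sigma` are stand-ins, not part of the lecture:

```python
import numpy as np

def gaussian_render_likelihood(I, scene, f, sigma=0.1):
    """p(I | scene), up to a constant, under I = f(scene) + Gaussian pixel noise."""
    residual = I - f(scene)
    return np.exp(-0.5 * np.sum(residual ** 2) / sigma ** 2)

# Toy renderer: a 'scene' is a single brightness level rendered to a 2x2 image.
f = lambda s: np.full((2, 2), s)

I = f(0.5)  # an image rendered from scene s = 0.5, with no noise added
# The generating scene is more likely than a different one:
assert gaussian_render_likelihood(I, 0.5, f) > gaussian_render_likelihood(I, 0.9, f)
```

As `sigma → 0` this likelihood concentrates on images exactly matching `f(scene)`, recovering the noiseless delta-function case on the slide.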

Priors weight likelihood selections

The likelihood selects the world states consistent with the image; the prior further narrows the selection. Select the most probable state, i.e. the s for which p(s) is biggest. Adapted from a figure by Sinha & Adelson.

Graphical Models for Perceptual Inference

[Diagram: object description variables Q, lighting variables L, and viewing variables V (the priors) all point to observations O_1, O_2, …, O_N (the likelihoods)]
All of these nodes can be subdivided into individual variables for a more compact representation of the probability distribution.

There is an ideal world…

Matching environment to assumptions:
–Prior on light source direction

Prior on light source motion

Combining likelihoods over different observations

Multiple data sources, sometimes contradictory. What data is relevant? How should it be combined?
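One standard answer: for independent cues, multiply the likelihoods. For Gaussian cues this reduces to a precision-weighted average. A sketch with invented cue values and noise levels:

```python
# Precision-weighted fusion of two independent Gaussian cues about the same
# world property s (e.g., depth from stereo and from motion). Values invented.
mu1, var1 = 2.0, 0.5   # cue 1 estimate and its variance (more reliable)
mu2, var2 = 3.0, 1.0   # cue 2 estimate and its variance (less reliable)

w1 = (1 / var1) / (1 / var1 + 1 / var2)   # precision weight for cue 1
w2 = 1 - w1
mu_combined = w1 * mu1 + w2 * mu2
var_combined = 1 / (1 / var1 + 1 / var2)

# The fused estimate lies between the cues, closer to the more reliable one,
# and is more reliable (lower variance) than either cue alone.
assert mu1 < mu_combined < mu2 and mu_combined < (mu1 + mu2) / 2
assert var_combined < min(var1, var2)
```

When the cues contradict each other strongly, this product-of-likelihoods rule still averages them; handling outright conflict (deciding a cue is irrelevant) requires a richer model than this sketch.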

Modeling Perceptual Decisions

One possible strategy for human perception is to try to get the right answer. How do we model this with decision theory? Penalize erroneous responses with a utility function: given sensory data O = {o_1, o_2, …, o_n} and some world property S perception is trying to infer, compute the decision

a* = argmax_a Σ_s U(s, a) P(s | O)

where U(s, a) is the value of response a when the true state is s, and the response a* is the one with the best expected value.
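The decision rule can be sketched directly: average the utility of each action over the posterior, then pick the maximizer. The posterior and utility values here are invented for illustration:

```python
import numpy as np

# Posterior over world states s given observations O (hypothetical numbers).
posterior = np.array([0.3, 0.7])       # P(s=0 | O), P(s=1 | O)

# U[s, a]: utility of action a when the true state is s.
# Here: reward 1 for a correct response, 0 for an error.
U = np.array([[1.0, 0.0],
              [0.0, 1.0]])

expected_utility = posterior @ U       # E[U(a)] = sum_s P(s|O) U(s, a)
best_action = int(np.argmax(expected_utility))
assert best_action == 1                # respond with the more probable state
```

With this 0/1 ("get the right answer") utility, maximizing expected utility is exactly choosing the state with the highest posterior probability; asymmetric utilities would shift the choice away from the posterior mode.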

Rigidity judgment: Rigid rotation or shape change?

Let s be a random variable representing rigidity: s = 1, it is rigid; s = 0, not rigid.
Actions: a = 1, choose rigid; a = 0, choose not rigid.
Let O_1, O_2, …, O_n be descriptions of the ellipse for frames 1, 2, …, n.
Need the likelihood P(O_1, O_2, …, O_n | s) and the prior P(s).
Utility U(s, a):
        a = 0   a = 1
s = 0
s = 1

“You must choose, but choose wisely”

Given the previous formulation, can we minimize the number of errors we make? Given responses a_i, categories s_i, current category s, and data o, to minimize error:
–Decide a_i if P(a_i | o) > P(a_k | o) for all k ≠ i
Equivalently:
P(o | s_i) P(s_i) > P(o | s_k) P(s_k)
P(o | s_i) / P(o | s_k) > P(s_k) / P(s_i)
P(o | s_i) / P(o | s_k) > T
Optimal classifications always involve hard boundaries.
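The final inequality is a likelihood-ratio test with threshold T set by the priors. A minimal two-class sketch; the Gaussian likelihoods, means, and priors are invented:

```python
import math

# Two hypothetical classes with Gaussian likelihoods p(o | s_i). Values invented.
mu = {0: 0.0, 1: 2.0}
sigma = 1.0
prior = {0: 0.7, 1: 0.3}

def likelihood(o, s):
    # Unnormalized Gaussian density; the shared normalizer cancels in the ratio.
    return math.exp(-0.5 * ((o - mu[s]) / sigma) ** 2)

def decide(o):
    # Decide class 1 iff p(o|s_1)/p(o|s_0) > T = P(s_0)/P(s_1)
    T = prior[0] / prior[1]
    return 1 if likelihood(o, 1) / likelihood(o, 0) > T else 0

# The optimal rule is a single hard threshold in o:
assert decide(-1.0) == 0 and decide(3.0) == 1
```

For these Gaussians the log likelihood ratio is linear in o, so the decision boundary is one hard cut on the o axis, illustrating the slide's closing point.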