Decision making as a model 5.a. more models and measures b. costs and benefits c. the optimal criterion (Bayes is back)


Unequal variances:

Two ROC points A and B: same sensitivity (model: unequal variances), or different d′s (model: equal variances)?

Unequal-variance model (σ_n = 1, σ_s ≠ 1), criterion at λ:

P_FA = Φ(−λ), so z_FA = −λ
P_H = Φ((μ_s − λ)/σ_s), so z_H = (μ_s − λ)/σ_s

On z-coordinates the ROC is a straight line with slope tan(θ) = 1/σ_s:
z_H = (μ_s + z_FA)/σ_s
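A minimal numeric check of the z-coordinate form of the unequal-variance ROC. The parameter values (μ_s = 2.0, σ_s = 1.5 and the criterion placements) are illustrative, not taken from the slides:

```python
# Unequal-variance Gaussian model, sigma_n = 1 (illustrative values).
from statistics import NormalDist

Phi = NormalDist().cdf       # standard-normal CDF
z = NormalDist().inv_cdf     # its inverse (the z-transform)

mu_s, sigma_s = 2.0, 1.5

for lam in (0.5, 1.0, 1.5):                 # a few criterion placements
    P_FA = Phi(-lam)                        # hence z_FA = -lam
    P_H = Phi((mu_s - lam) / sigma_s)       # hence z_H = (mu_s - lam)/sigma_s
    z_FA, z_H = z(P_FA), z(P_H)
    # On z-coordinates the ROC is a line with slope 1/sigma_s:
    assert abs(z_H - (mu_s + z_FA) / sigma_s) < 1e-9
    print(f"lambda={lam}: z_FA={z_FA:+.3f}, z_H={z_H:+.3f}")
```

Every criterion placement lands on the same straight line, which is why a z-plot of empirical (P_FA, P_H) pairs is the usual way to estimate σ_s.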

Δm alone does not distinguish between large and small σ_s. Measures based on the distance from the origin O to the ROC line in z-coordinates are analogous to d′:

d_e = Oe·√2 = 2μ_s/(1 + σ_s), where e is the intersection with the minor diagonal (z_H = −z_FA)
d_a = Oa·√2 = √2·μ_s/√(1 + σ_s²), where a is the foot of the perpendicular from O
(Pythagoras and similar triangles)
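A sketch of the two distance-based sensitivity measures, again for the illustrative values μ_s = 2.0, σ_s = 1.5. The point e is computed directly as the intersection of the ROC line z_H = (μ_s + z_FA)/σ_s with the minor diagonal:

```python
# d_e and d_a from the ROC line in z-coordinates (illustrative values).
from math import sqrt, hypot

mu_s, sigma_s = 2.0, 1.5

# e: intersection of z_H = (mu_s + z_FA)/sigma_s with z_H = -z_FA
z_FA_e = -mu_s / (1 + sigma_s)
z_H_e = -z_FA_e
d_e = sqrt(2) * hypot(z_FA_e, z_H_e)        # = Oe * sqrt(2)

# a: foot of the perpendicular from the origin onto the ROC line
d_a = sqrt(2) * mu_s / sqrt(1 + sigma_s**2)  # = Oa * sqrt(2)

assert abs(d_e - 2 * mu_s / (1 + sigma_s)) < 1e-9
print(f"d_e = {d_e:.4f}, d_a = {d_a:.4f}")
```

With σ_s = 1 both expressions reduce to μ_s, i.e. to the ordinary d′.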

To get A_z, the area under the ROC curve according to the Gaussian model with unequal variances: derive a formula for the proportion correct in a 2AFC experiment under that model, P_CZ. According to the area theorem P_C equals A, so P_CZ equals A_z.

Area under the Gaussian ROC curve: A_z. Gaussian 2AFC: P_C = p(x_s > x_n) = p(x_s − x_n > 0).

P_CZ = p(x_s > x_n) = p(x_s − x_n > 0)
     = 1 − Φ(−μ_s/√(1 + σ_s²))
     = Φ(μ_s/√(1 + σ_s²))
     = A_z according to the area theorem!

(The variance of the difference of two independent random variables is the sum of both variances, so x_s − x_n is Gaussian with mean μ_s and variance 1 + σ_s².)
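The area theorem can be checked numerically by simulating a 2AFC experiment: draw one sample from the signal distribution and one from the noise distribution, and score a trial correct when x_s > x_n. Parameters are the same illustrative μ_s = 2.0, σ_s = 1.5:

```python
# A_z = Phi(mu_s / sqrt(1 + sigma_s^2)) vs. simulated 2AFC proportion correct.
import random
from math import sqrt
from statistics import NormalDist

Phi = NormalDist().cdf
mu_s, sigma_s = 2.0, 1.5
rng = random.Random(0)                       # fixed seed for reproducibility

A_z = Phi(mu_s / sqrt(1 + sigma_s**2))

trials = 100_000
correct = sum(
    rng.gauss(mu_s, sigma_s) > rng.gauss(0.0, 1.0)   # x_s > x_n ?
    for _ in range(trials)
)
P_C = correct / trials

print(f"A_z = {A_z:.3f}, simulated 2AFC P_C = {P_C:.3f}")
assert abs(P_C - A_z) < 0.01                 # area theorem, up to sampling noise
```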

Area under the Gaussian ROC curve:

A_z = Φ(μ_s/√(1 + σ_s²)) = Φ(d_a/√2), since d_a/√2 = μ_s/√(1 + σ_s²).

Equal variances (σ_s = 1): A_z = A_d′ = Φ(d′/√2) (already shown).

β with unequal variances: β = h/f = f_s(λ)/f_n(λ)

f_n(λ) = (1/√(2π))·e^(−λ²/2) = (1/√(2π))·e^(−z_FA²/2)
f_s(λ) = (1/(σ_s√(2π)))·e^(−(λ − μ_s)²/(2σ_s²)) = (1/(σ_s√(2π)))·e^(−z_H²/2)

Divide: β = (1/σ_s)·e^((z_FA² − z_H²)/2)

So: β_unequal = β_equal/σ_s (from slide 5)
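A quick check of this β formula: the density ratio at the criterion, computed directly from the two Gaussian densities, should match (1/σ_s)·e^((z_FA² − z_H²)/2). Values (μ_s = 2.0, σ_s = 1.5, λ = 1.0) are illustrative:

```python
# beta under unequal variances: direct density ratio vs. the closed form.
from math import exp
from statistics import NormalDist

mu_s, sigma_s, lam = 2.0, 1.5, 1.0          # illustrative values

f_n = NormalDist(0.0, 1.0).pdf(lam)          # noise density at the criterion
f_s = NormalDist(mu_s, sigma_s).pdf(lam)     # signal density at the criterion
beta = f_s / f_n

z_FA = -lam
z_H = (mu_s - lam) / sigma_s
beta_formula = (1.0 / sigma_s) * exp((z_FA**2 - z_H**2) / 2)

assert abs(beta - beta_formula) < 1e-12
print(f"beta = {beta:.4f}")
```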

Survey of signal-detection measures:

                                   Sensitivity       Criterion/bias
General                            A                 S_LR
Rough (one pt)                     A′                B″
Gaussian, σ_n ≠ σ_s (many pts)     A_z, d_a, d_e     S, β
Gaussian, σ_n = σ_s (one pt)       A_d′, d′          β, c

With these measures the sensitivity and the criteria of humans, machines and systems can be expressed independently.

What are the costs of missing a weapon/explosive at an airport? What are the costs of a false alarm? What are the costs and benefits of baggage screening?

Costs and benefits: pay-off matrix

            "yes"      "no"
S(+N)       V_Hit      C_Miss
N           C_FA       V_CR

NB: C is a positive number: "a false alarm will cost you €5."

EV = p(Hit)·V_Hit − p(Miss)·C_Miss + p(CR)·V_CR − p(FA)·C_FA − C_scr
   = p(s)·{P_H·V_Hit − (1 − P_H)·C_Miss} + p(n)·{(1 − P_FA)·V_CR − P_FA·C_FA} − C_scr

Compare with doing nothing: EV = p(n)·V_CR − p(s)·C_Miss

NB: no free lunch, no free screening (hence the −C_scr term)!
NB: p(Hit) = P_H·p(s)!
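The two expected values are simple to compute once the pay-off matrix, the priors, and the screener's hit/false-alarm rates are filled in. All numbers below are made up for the example (including the screening cost C_scr):

```python
# Expected value of screening vs. doing nothing (illustrative numbers).
V_Hit, C_Miss, V_CR, C_FA = 10.0, 100.0, 1.0, 5.0   # pay-off matrix
p_s, p_n = 0.01, 0.99        # prior probabilities of signal / noise
P_H, P_FA = 0.90, 0.10       # screener's hit and false-alarm rates
C_scr = 0.50                 # cost of running the screening at all

EV_screen = (p_s * (P_H * V_Hit - (1 - P_H) * C_Miss)
             + p_n * ((1 - P_FA) * V_CR - P_FA * C_FA)
             - C_scr)
EV_nothing = p_n * V_CR - p_s * C_Miss   # answer "no" to everything

print(f"EV(screen) = {EV_screen:.3f}, EV(do nothing) = {EV_nothing:.3f}")
```

With these particular numbers screening actually loses to doing nothing, which illustrates the "no free screening" point: whether screening pays depends on the priors and C_scr, not only on the screener's sensitivity.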

An optimal decision under uncertainty: set the criterion at the value of x (x_c) at which the expected value/utility of "Yes" equals the expected value/utility of "No":

EV(Yes|x_c) = EV(No|x_c)

V_Hit·p(Hit) − C_FA·p(FA) = V_CR·p(CR) − C_Miss·p(Miss)   ("cost": C_FA is positive!)

At the criterion x_c:
V_Hit·p(signal|x_c) − C_FA·p(noise|x_c) = V_CR·p(noise|x_c) − C_Miss·p(signal|x_c)
V_Hit·p(signal|x_c) + C_Miss·p(signal|x_c) = V_CR·p(noise|x_c) + C_FA·p(noise|x_c)
p(signal|x_c)·(V_Hit + C_Miss) = p(noise|x_c)·(V_CR + C_FA)

p(signal|x_c)   V_CR + C_FA
───────────── = ─────────────
p(noise|x_c)    V_Hit + C_Miss

But do we know that one?

p(signal|x_c)/p(noise|x_c) = (V_CR + C_FA)/(V_Hit + C_Miss) — we want this one.

What we know (in principle): p(x|noise) and p(x|signal).
Required: a way to get from p(A|B) to p(B|A). Bayes' Rule!

Bayes' rule in odds form:

p(A|B)/p(¬A|B) = [p(B|A)/p(B|¬A)] · [p(A)/p(¬A)]

Applied to signal detection:

p(signal|x_c)/p(noise|x_c) = [p(x_c|signal)/p(x_c|noise)] · [p(signal)/p(noise)]

p(signal|x_c)/p(noise|x_c) = (V_CR + C_FA)/(V_Hit + C_Miss)

Substituting Bayes' rule, [p(x_c|signal)/p(x_c|noise)]·[p(signal)/p(noise)] = (V_CR + C_FA)/(V_Hit + C_Miss), so:

p(x_c|signal)   p(noise)    V_CR + C_FA
───────────── = ───────── · ─────────────
p(x_c|noise)    p(signal)   V_Hit + C_Miss

   LR_c       = prior odds · pay-off matrix   →   S, β
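The optimal likelihood-ratio criterion LR_c is a one-line computation; with equal-variance Gaussians the corresponding cutoff x_c also has a closed form, since LR(x) = e^(μ_s·x − μ_s²/2) there. The numbers below are illustrative:

```python
# Optimal LR criterion from prior odds and pay-off matrix, and the
# corresponding x_c in the equal-variance Gaussian case (sigma = 1).
from math import log

V_Hit, C_Miss, V_CR, C_FA = 10.0, 100.0, 1.0, 5.0   # illustrative pay-offs
p_s, p_n = 0.01, 0.99                               # illustrative priors

LR_c = (p_n / p_s) * (V_CR + C_FA) / (V_Hit + C_Miss)

# Equal variances: LR(x) = exp(mu_s*x - mu_s^2/2), so LR(x_c) = LR_c gives
# x_c = ln(LR_c)/mu_s + mu_s/2.
mu_s = 2.0
x_c = log(LR_c) / mu_s + mu_s / 2

print(f"LR_c = {LR_c:.3f}, x_c = {x_c:.3f}")
```

Note how the rare signal (prior odds 99:1 against) pushes LR_c well above 1, i.e. toward a conservative criterion, even though a miss is far more costly than a false alarm.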

So: an ideal observer, knowing the prior odds and the pay-off matrix, can compute an optimal criterion:

p(x_c|signal)/p(x_c|noise) = [p(noise)/p(signal)] · [(V_CR + C_FA)/(V_Hit + C_Miss)]

People are not that good at arithmetic, but they adapt reasonably well to the pay-off matrix and the prior odds.

Want to know more about signal detection?

Wickens, T. D. (2002). Elementary signal detection theory. Oxford: Oxford University Press.
Macmillan, N. A., & Creelman, C. D. (2005). Detection theory: A user's guide (2nd ed.). New York: Lawrence Erlbaum Associates.