
Decision making as a model, 5: (a) more models and measures, (b) costs and benefits, (c) the optimal criterion (Bayes is back)


1 Decision making as a model, 5: (a) more models and measures, (b) costs and benefits, (c) the optimal criterion (Bayes is back)

2 Unequal variances:

3

4 Same sensitivity (unequal-variances model) or different d′'s (equal-variances model)? (panels A and B)

5 Unequal variance model (σ_n = 1, σ_s). With criterion λ:
P_FA = Φ(-λ), so z_FA = -λ
P_H = Φ((μ_s - λ)/σ_s), so z_H = (μ_s - λ)/σ_s
Eliminating λ gives a straight zROC line:
z_H = μ_s/σ_s + (1/σ_s)·z_FA
with slope tg(θ) = 1/σ_s, z_H-intercept μ_s/σ_s and z_FA-intercept -μ_s.
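
(A minimal numerical sketch of this zROC line; not from the slides. It assumes Python with NumPy/SciPy and made-up example values μ_s = 2, σ_s = 1.5.)

    import numpy as np
    from scipy.stats import norm

    mu_s, sigma_s = 2.0, 1.5            # example parameters; sigma_n = 1
    lam = np.linspace(-1.0, 3.0, 9)     # a range of criteria lambda

    z_fa = -lam                                   # z_FA = -lambda
    z_h = (mu_s - lam) / sigma_s                  # z_H = (mu_s - lambda)/sigma_s
    p_fa, p_h = norm.cdf(z_fa), norm.cdf(z_h)     # the ROC points themselves

    # The z-transformed ROC is a straight line: slope 1/sigma_s, intercept mu_s/sigma_s
    slope, intercept = np.polyfit(z_fa, z_h, 1)
    print(slope, 1 / sigma_s)           # both ~0.667
    print(intercept, mu_s / sigma_s)    # both ~1.333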

6 On the zROC (axes z_H, z_FA): Δm does not distinguish between large and small σ_s; the distance of the zROC line to the origin O is analogous to d′.
Measures:
d_e = Oe·√2, where e is the point where the zROC crosses the minor diagonal z_H = -z_FA
d_a = Oa·√2, where a is the foot of the perpendicular from O onto the zROC, so Oa = μ_s/√(1 + σ_s²)
(Pythagoras and similar triangles)
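
(A sketch of how d_e and d_a follow from this geometry; my own illustration, not from the slides, using the same assumed μ_s = 2, σ_s = 1.5.)

    import numpy as np

    mu_s, sigma_s = 2.0, 1.5

    # Point e: where the zROC line z_H = (mu_s + z_FA)/sigma_s crosses the
    # minor diagonal z_H = -z_FA
    z_fa_e = -mu_s / (1 + sigma_s)
    z_h_e = mu_s / (1 + sigma_s)
    Oe = np.hypot(z_fa_e, z_h_e)          # distance from the origin O to e
    d_e = Oe * np.sqrt(2)                 # = 2*mu_s / (1 + sigma_s)

    # Point a: foot of the perpendicular from O onto the zROC line
    Oa = mu_s / np.sqrt(1 + sigma_s**2)   # perpendicular distance from O
    d_a = Oa * np.sqrt(2)                 # = mu_s*sqrt(2) / sqrt(1 + sigma_s**2)

    print(d_e, 2 * mu_s / (1 + sigma_s))                       # 1.6, 1.6
    print(d_a, mu_s * np.sqrt(2) / np.sqrt(1 + sigma_s**2))    # ~1.57, ~1.57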

7 To get A_z, the area under the ROC curve according to the Gaussian model with unequal variances: derive a formula for the proportion correct in a 2AFC experiment under that model, P_CZ. According to the area theorem P_C equals A, so P_CZ equals A_z.

8 Area under the Gaussian ROC curve (P_H against P_FA): A_z.
Gaussian 2AFC (distributions n and s): P_C = p(x_s > x_n) = p(x_s - x_n > 0)

9 P_CZ = p(x_s > x_n) = p(x_s - x_n > 0).
The variance of the difference of two independent random variables is the sum of both variances, so x_s - x_n has mean μ_s and variance 1 + σ_s².
P_CZ = 1 - Φ(-μ_s/√(1 + σ_s²)) = Φ(μ_s/√(1 + σ_s²)) = A_z according to the area theorem!
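
(A quick Monte-Carlo check of the area theorem under this model; my own sketch, assuming Python with NumPy/SciPy and the same example μ_s = 2, σ_s = 1.5.)

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mu_s, sigma_s, n = 2.0, 1.5, 1_000_000

    x_n = rng.normal(0.0, 1.0, n)          # noise samples
    x_s = rng.normal(mu_s, sigma_s, n)     # signal samples

    p_c_sim = np.mean(x_s > x_n)                       # simulated 2AFC proportion correct
    A_z = norm.cdf(mu_s / np.sqrt(1 + sigma_s**2))     # closed form P_CZ
    print(p_c_sim, A_z)                                # both ~0.866
    # Note: A_z also equals norm.cdf(d_a / sqrt(2)), as slide 10 states.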

10 Area under the Gaussian ROC curve (P_H against P_FA): A_z = Φ(d_a/√2), since
d_a/√2 = Oa = (μ_s/σ_s)/√(1/σ_s² + 1) = μ_s/√(1 + σ_s²),
which matches the Φ(μ_s/√(1 + σ_s²)) found above.
Equal variances: A_z = A_d′ = Φ(d′/√2) (already shown).

11 β with unequal variances (figure as on slide 5: criterion λ, z_FA = -λ, z_H = (μ_s - λ)/σ_s):
β = h/f = f_s(λ)/f_n(λ)
f_n(λ) = (1/√(2π))·e^(-λ²/2) = (1/√(2π))·e^(-z_FA²/2)
f_s(λ) = (1/(σ_s√(2π)))·e^(-(λ - μ_s)²/(2σ_s²)) = (1/(σ_s√(2π)))·e^(-z_H²/2)
Divide: f_s(λ)/f_n(λ) = (1/σ_s)·e^((z_FA² - z_H²)/2)
So: β_unequal = β_equal/σ_s (from slide 5)
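
(A small numerical check of this β; my own sketch with the same assumed μ_s = 2, σ_s = 1.5 and an example criterion λ = 1.2.)

    import numpy as np
    from scipy.stats import norm

    mu_s, sigma_s, lam = 2.0, 1.5, 1.2

    # Likelihood ratio at the criterion: beta = f_s(lambda) / f_n(lambda)
    beta_direct = norm.pdf(lam, mu_s, sigma_s) / norm.pdf(lam, 0.0, 1.0)

    # The same via the z-coordinates of this slide
    z_fa = -lam
    z_h = (mu_s - lam) / sigma_s
    beta_equal = np.exp((z_fa**2 - z_h**2) / 2)   # the equal-variance expression
    beta_unequal = beta_equal / sigma_s           # divided by sigma_s

    print(beta_direct, beta_unequal)              # identical, ~1.19 here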

12 Survey of signal detection measures, from general and rough (a full ROC or a single point) to the Gaussian models with σ_n ≠ σ_s and with σ_n = σ_s:
sensitivity measures: A, A′, A_z, d_a, d_e, A_d′, d′
criterion/bias measures: S, LR_c, B″, β, c
With these measures the sensitivity and the criterion of humans, machines and systems can be expressed independently.

13 What are the costs of missing a weapon/explosive at an airport? What are the costs of a false alarm? What are the costs and benefits of baggage screening?

14 Costs and benefits: pay-off matrix (NB: C is a positive number: "a false alarm will cost you €5")

                 "yes"      "no"
    S (+N):      V_Hit      C_Miss
    N:           C_FA       V_CR

EV = p(Hit)·V_Hit - p(Miss)·C_Miss + p(CR)·V_CR - p(FA)·C_FA
   = p(s)·{P_H·V_Hit - (1 - P_H)·C_Miss} + p(n)·{(1 - P_FA)·V_CR - P_FA·C_FA} - C_scr
(NB: p(Hit) = P_H·p(s)! NB: no free lunch, no free screening, hence the -C_scr.)
Compare with doing nothing: EV = p(n)·V_CR - p(s)·C_Miss
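
(A minimal sketch of this expected-value comparison; my own illustration, not from the slides, and all numbers are made-up examples.)

    # Assumed example pay-off matrix (the C's are positive costs), priors and performance
    V_hit, C_miss = 10.0, 100.0
    V_cr, C_fa = 1.0, 5.0
    p_s, p_n = 0.01, 0.99        # prior probabilities of signal and noise
    P_H, P_FA = 0.90, 0.10       # hit and false-alarm rates of the screener
    C_scr = 0.50                 # cost of the screening itself

    # EV of screening; note that p(Hit) = P_H * p(s), p(CR) = (1 - P_FA) * p(n), etc.
    ev_screening = (p_s * (P_H * V_hit - (1 - P_H) * C_miss)
                    + p_n * ((1 - P_FA) * V_cr - P_FA * C_fa)
                    - C_scr)

    # EV of doing nothing: every noise trial is a correct rejection, every signal trial a miss
    ev_nothing = p_n * V_cr - p_s * C_miss

    print(ev_screening, ev_nothing)

Whether screening pays off depends on the priors, the pay-off matrix and the screening cost C_scr.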

15 An optimal decision under uncertainty: set the criterion at the value of x (x_c) at which the expected value/utility of "Yes" equals the expected value/utility of "No":
EV(Yes|x_c) = EV(No|x_c)

16 EV(Yes|x_c) = EV(No|x_c):
V_Hit·p(Hit) - C_FA·p(FA) = V_CR·p(CR) - C_Miss·p(Miss)   ("cost": C_FA is positive!)
Conditioning on x = x_c:
V_Hit·p(signal|x_c) - C_FA·p(noise|x_c) = V_CR·p(noise|x_c) - C_Miss·p(signal|x_c)
V_Hit·p(signal|x_c) + C_Miss·p(signal|x_c) = V_CR·p(noise|x_c) + C_FA·p(noise|x_c)
p(signal|x_c)·(V_Hit + C_Miss) = p(noise|x_c)·(V_CR + C_FA)
p(signal|x_c)/p(noise|x_c) = (V_CR + C_FA)/(V_Hit + C_Miss)
But do we know that one?

17 We want: p(signal|x_c)/p(noise|x_c) = (V_CR + C_FA)/(V_Hit + C_Miss).
What we know (in principle) is p(x|noise) and p(x|signal).
Required: a way to get from p(A|B) to p(B|A). Bayes' Rule!

18 Bayes' Rule (odds form):
p(A|B)/p(¬A|B) = [p(B|A)/p(B|¬A)]·[p(A)/p(¬A)]
Applied to signal detection:
p(signal|x_c)/p(noise|x_c) = [p(x_c|signal)/p(x_c|noise)]·[p(signal)/p(noise)]
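
(A short numerical illustration that the odds form is Bayes' theorem rearranged; my own example with arbitrary assumed numbers.)

    from scipy.stats import norm

    mu_s, sigma_s, x_c = 2.0, 1.5, 1.2
    p_s, p_n = 0.2, 0.8          # assumed prior probabilities

    lr = norm.pdf(x_c, mu_s, sigma_s) / norm.pdf(x_c, 0.0, 1.0)   # p(x_c|signal)/p(x_c|noise)
    posterior_odds = lr * (p_s / p_n)                             # odds form of Bayes' Rule

    # Direct check: unnormalized posteriors p(x_c|A) * p(A); the normalizer cancels in the ratio
    print(posterior_odds, (norm.pdf(x_c, mu_s, sigma_s) * p_s) / (norm.pdf(x_c, 0.0, 1.0) * p_n))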

19 Optimal criterion:
p(signal|x_c)/p(noise|x_c) = (V_CR + C_FA)/(V_Hit + C_Miss)
With Bayes (odds form):
[p(x_c|signal)/p(x_c|noise)]·[p(signal)/p(noise)] = (V_CR + C_FA)/(V_Hit + C_Miss)
So:
p(x_c|signal)/p(x_c|noise) = [p(noise)/p(signal)]·[(V_CR + C_FA)/(V_Hit + C_Miss)]
i.e. LR_c (cf. S, β) = prior odds × pay-off-matrix term.
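
(A sketch of the ideal observer's computation; mine, not from the slides. It assumes the equal-variance Gaussian case, where the criterion x_c that yields this likelihood ratio has a closed form.)

    import numpy as np
    from scipy.stats import norm

    # Assumed example pay-off matrix and priors
    V_hit, C_miss, V_cr, C_fa = 10.0, 100.0, 1.0, 5.0
    p_s, p_n = 0.01, 0.99
    d_prime = 2.0                 # equal variances: sigma_n = sigma_s = 1

    # Optimal likelihood ratio at the criterion: prior odds times the pay-off term
    lr_c = (p_n / p_s) * (V_cr + C_fa) / (V_hit + C_miss)

    # With equal variances LR(x) = exp(d' * x - d'**2 / 2), so solve LR(x_c) = lr_c:
    x_c = np.log(lr_c) / d_prime + d_prime / 2

    # Check: the likelihood ratio at x_c reproduces lr_c
    print(lr_c, norm.pdf(x_c, d_prime, 1.0) / norm.pdf(x_c, 0.0, 1.0))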

20 So: an ideal observer, knowing the prior odds and the pay-off matrix, can compute an optimal criterion from
p(x_c|signal)/p(x_c|noise) = [p(noise)/p(signal)]·[(V_CR + C_FA)/(V_Hit + C_Miss)]
People are not that good at arithmetic, but they adapt reasonably well to the pay-off matrix and the prior odds.

21 Want to know more about signal detection?
Wickens, T. D. (2002). Elementary signal detection theory. Oxford University Press.
Macmillan, N. A., & Creelman, C. D. (2005). Detection theory: A user's guide (2nd ed.). New York: Lawrence Erlbaum Associates.

