
Slide 1: BAYES RULES!
- in finite models (ECCAI 2000)
- but not in infinite! (MaxEnt 2000)
Stefan Arnborg, KTH; Gunnar Sjödin, SICS

Slide 2: Normative claim of Bayesianism
- EVERY type of uncertainty should be treated as probability.
- Aristotle, Sun Zi (300 BC), Bayes (1763), Laplace, von Clausewitz, de Finetti, Jeffreys, Keynes, Ramsey, Oxenstierna, Adams, Lindley, Cheeseman, Jaynes, ...
- This claim is controversial and not universally accepted: Fisher (1922), Cramér, Zadeh, Dempster, Shafer, Smets, Walley (1999), ...

Slide 3: Fundamental justifications
- Consistent betting paradigm, coherence: de Finetti, Savage (1950), Lindley (1982), ..., Snow (1999)
- Information based: Cox (1946), Aczél (1966), Jaynes (1994); criticized by Paris (1994), Halpern (1999)

Slide 4: Main result
- Cox's information-based justification can be derived from weak common-sense assumptions. There is a difference between finite and infinite models.
- The assumptions are:
  - Refinability
  - Information independence
  - Strict monotonicity
  - Infinite case: the model is closed or closable

Slide 5: Jaynes's desiderata on uncertainty management
- Uncertainty is measured by a real number that depends on the information the subject possesses: A|C is the plausibility of A given C.
- Consistency.
- Common sense.

Slide 6: Real-numbered uncertainties
- Given a set of statements (possible-world sets) A, B, C, ...
- Plausibility A|C: the plausibility of A given that C is known to be true - a real number.
- Conjunction: AB; disjunction: A+B; difference: A-B.
- AB|C = F(A|BC, B|C)
- A+B|C = G(A|C, B-A|C)
- not-A|C = S(A|C)

Slide 7: Rescalability theorems
- Under suitable assumptions there is a strictly monotone function w(x) such that
  - w(F(x,y)) = w(x)w(y)
  - w(G(x,y)) = w(x) + w(y)
- That is, by rescaling the plausibility measure by w, the model becomes a probability model.
- That is, if you accept the assumptions, then Bayes rules!
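A hypothetical example of such a rescaling: suppose plausibilities live on an additive, log-like scale where F is addition and G is log-sum-exp (this particular calculus is an illustrative assumption, not from the slides). Then w(x) = e^x is exactly the strictly monotone rescaling the theorem promises:

```python
import math
import random

# Hypothetical plausibility calculus on an additive scale:
#   F(x, y) = x + y              (conjunction combines additively)
#   G(x, y) = log(e^x + e^y)     (disjunction is log-sum-exp)
# The strictly monotone rescaling w(x) = e^x turns this into probability:
#   w(F(x, y)) = w(x) * w(y),    w(G(x, y)) = w(x) + w(y).
F = lambda x, y: x + y
G = lambda x, y: math.log(math.exp(x) + math.exp(y))
w = math.exp

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-5, 0), random.uniform(-5, 0)
    assert math.isclose(w(F(x, y)), w(x) * w(y))
    assert math.isclose(w(G(x, y)), w(x) + w(y))
print("w rescales F to * and G to +")
```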

Slide 8: Invariance under rescaling
- The operations * and + are strictly monotone, symmetric, associative and jointly distributive.
- These properties are invariant under strictly monotone rescaling.
- If F and G violate the properties, rescaling is impossible.

Slide 9: Consistency
- AB|C == BA|C, thus F(A|BC, B|C) = F(B|AC, A|C)
- A+B|C == B+A|C
- (AB)C|D == A(BC)|D
- (A+B)C|D == AC+BC|D
- Does this mean that F and G must be associative, symmetric and jointly distributive?
- No, but with our assumptions these laws are inherited from the corresponding laws of propositional logic!
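For the probability model (F(x,y) = xy, G(x,y) = x + y) the inherited laws can be spot-checked numerically; the random sampling below is only an illustration, not a proof:

```python
import math
import random

# Probability instance of the combination functions.
F = lambda x, y: x * y
G = lambda x, y: x + y

random.seed(1)
for _ in range(1000):
    x, y, z = (random.random() for _ in range(3))
    assert math.isclose(F(x, y), F(y, x))                    # symmetry
    assert math.isclose(F(F(x, y), z), F(x, F(y, z)))        # associativity of F
    assert math.isclose(G(G(x, y), z), G(x, G(y, z)))        # associativity of G
    assert math.isclose(F(G(x, y), z), G(F(x, z), F(y, z)))  # joint distributivity
print("F, G satisfy symmetry, associativity and joint distributivity")
```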

Slide 10: Previous common-sense assumptions
- F and G are strictly increasing.
- F and G (and S) are twice continuously differentiable on (0,1) and associative (Cox 1946).
- F and G are associative and continuous (Aczél 1966).
- A complex condition (Paris 1994).
- A 'counterexample' by Halpern (1999).

Slide 11: OUR common-sense assumptions
- REFINABILITY: Assume B'|B = c was defined; it is now possible to refine another event A by A' so that A'|A = c (cf. Tribus, Jimison, Heckerman).
- INFORMATION INDEPENDENCE: New events obtained by refinement of the same event can be postulated independent: A|BC = A|C and B|AC = B|C. 'Knowledge of one has no effect on the plausibility of the other.'

Slide 12: Halpern's example
Worlds A, B, C, D, E, G, H, I, J, K, L, M, with plausibility constraints:
- D|E = H|J
- B|C = L|M
- A|C = I|J
- E|G = A|B
- H|J ≈ K|M
- D|G = K|LM

Slide 13: Example: F(F(x,y), z) ≈ F(x, F(y,z))
Worlds C, D, E, G, H, I, J, K, L, M, with:
- D|E = H|J = x
- B|C = L|M = z
- A|C = I|J
- E|G = A|B = y
- H|J ≈ K|M
- D|G = K|LM

Slide 14: Refine A'|A = D|E: INCONSISTENCY
Same configuration as before:
- D|E = H|J = x
- B|C = L|M = z
- A|C = I|J
- E|G = A|B = y
- H|J ≈ K|M
- D|G = K|LM
- With the refinement A': H|J = A'AB|C = K|M, contradicting H|J ≈ K|M (they were only approximately equal)!

Slide 15: Observation 1
- The functions F and G must be symmetric and associative if refinability and information independence are accepted.
- F and G must likewise be jointly distributive: F(G(x,y), z) = G(F(x,z), F(y,z)).
- But only on the finite domain of definition.
- This is not enough for rescalability.

Slide 16: Rescalability is solvability of a linear program
In log scale (Li = log w(xi)), the rescaling conditions become linear equations:
- L4 + L4 - La = 0
- L3 + L5 - La = 0
- L2 + L4 - Lb = 0
- L1 + L5 - Lb = 0
- L4 + L6 - Lc = 0
- L3 + L7 - Lc = 0
- L2 + L6 - Ld = 0
- L1 + L8 - Ld = 0
together with the strict order constraints L1 < L2 < ... from strict monotonicity.

Slide 17: Proof structure: Rescalability = Consistent refinability
- (i) -> (ii): a rescaling on a discrete set can be interpolated smoothly over (0,1).
- (ii) -> (i) is trickier: assume that rescalability is impossible and show that the existence of an inconsistent refinement follows. Find L such that ML = 0 and DL > 0.

Slide 18: Duality theory argument
- If no point satisfies ML = 0 and DL > 0, then a dual system has a solution d'. This solution is non-negative and normal to D on the null space of M.
- The null space of d'D contains the null space of M. Thus c'M = d'D for some integer vector c.
- The vector c yields an inconsistent refinement.

Slide 19: Duality explained
- If there is an L such that ML = 0, then DL > 0 fails for it.
- The subspace {L : ML = 0} has a non-negative normal d in the D-directions.
- The identity d1·L1 + ... + d(n-1)·L(n-1) = d1·L2 + ... + d(n-1)·Ln translates to F(a1, ..., ak, c1, ..., cm) = F(b1, ..., bk, c1, ..., cm) with ai

Slide 20: Inconsistency of the example
- F(x4, x4) = F(x3, x5) = a   [+1]
- F(x2, x4) = F(x1, x5) = b   [-1]
- F(x4, x6) = F(x3, x7) = c   [-1]
- F(x2, x6) = F(x1, x8) = d   [+1]
- F(x7, q) = F(x8, q), where q = F(x1, F(x2, F(x3, F(x4, F(x4, F(x5, x6))))))
- The linear system turns out to be non-solvable; from the dual solution we obtain c.
- Composing the equations as indicated by c yields an inconsistency: it corresponds to an inconsistent refinement consisting of 9 information-independent new cases with plausibilities x1, x2, x3, x4, x4, x5, x6, x7, x8 relative to an existing event.
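The sign vector (+1, -1, -1, +1) on the four pairs of equations can be checked mechanically. In log scale (writing Li for log w(xi)), each pair becomes a homogeneous linear relation among L1, ..., L8; combining the relations with these signs forces L7 = L8, contradicting strict monotonicity (x7 < x8). A minimal sketch, assuming only the equations on this slide:

```python
import numpy as np

# Each equality pair F(xi, xj) = F(xk, xl) becomes, in log scale,
# Li + Lj = Lk + Ll.  Rows are coefficient vectors over (L1, ..., L8).
def rel(i, j, k, l):
    row = np.zeros(8)
    row[i - 1] += 1; row[j - 1] += 1      # left-hand side
    row[k - 1] -= 1; row[l - 1] -= 1      # right-hand side
    return row

a = rel(4, 4, 3, 5)   # F(x4, x4) = F(x3, x5)
b = rel(2, 4, 1, 5)   # F(x2, x4) = F(x1, x5)
c = rel(4, 6, 3, 7)   # F(x4, x6) = F(x3, x7)
d = rel(2, 6, 1, 8)   # F(x2, x6) = F(x1, x8)

# The dual sign vector (+1, -1, -1, +1) combines the relations into L7 - L8 = 0.
combo = a - b - c + d
expected = np.zeros(8)
expected[6], expected[7] = 1, -1          # the vector for L7 - L8
assert np.array_equal(combo, expected)
print("any rescaling must satisfy L7 = L8, contradicting x7 < x8")
```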

Slide 21: Joint rescalability of F and G
- The duality argument also works for rescaling G to +.
- We have observed the law of joint distributivity: F(G(x,y), z) = G(F(x,z), F(y,z)).
- Scaling G to + transforms this into F(x+y, z) = F(x,z) + F(y,z), Cauchy's equation.

Slide 22: Cauchy's equation f(x+y) = f(x) + f(y)
- A classical argument shows that every bounded monotone solution must be linear: f(x) = kx.
- A slight modification replaces the denseness assumption by refinability.
- Thus F(x,z) = x·c(z); with F(1,z) = z this yields F(x,z) = xz!
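The rational-argument step of the Cauchy argument can be made concrete. Additivity F(x+y, z) = F(x, z) + F(y, z) with F(1, z) = z pins F down on positive rationals by repeated addition; monotonicity then extends it to all reals. A sketch of that step (the helper names are my own):

```python
from fractions import Fraction

# Additivity with F(1, z) = z determines F on positive rationals:
#   F(p, z)   = z + z + ... + z  (p times), by repeated additivity,
#   F(p/q, z) = F(p, z) / q,     since q * F(p/q, z) = F(p, z).
def F_integer(p, z):
    total = Fraction(0)
    for _ in range(p):
        total += z            # F(k+1, z) = F(k, z) + F(1, z)
    return total

def F_rational(p, q, z):
    return F_integer(p, z) / q

z = Fraction(3, 7)
for p in range(1, 6):
    for q in range(1, 6):
        assert F_rational(p, q, z) == Fraction(p, q) * z   # i.e. F(x, z) = x * z
print("Cauchy's equation forces F(x, z) = x * z on the rationals")
```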

Slide 23: Summary, finite case
- Rescalability follows for finite models from weak common-sense assumptions: strict monotonicity, refinability and information independence.
- Conjecture: Savage's and Lindley's consistent-betting-behavior analyses can be similarly strengthened.

Slide 24: INFINITE CASE: NON-SEPARABILITY
(Figure: probability model, counterexample, log probability.)

Slide 25: Extended probability
- Probability values are taken from a real ordered field.
- A real ordered field is generated by rationals, reals and infinitesimals. Example: x -> 0.5, y -> 0.5 + ε.
- Extended probability has been shown equivalent to non-monotonic reasoning schemes (Benferhat, Dubois, Prade, 1997).
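The example x -> 0.5, y -> 0.5 + ε can be illustrated with a first-order truncation of such values: numbers a + b·ε with ε a positive infinitesimal, compared lexicographically. This is only a sketch (terms in ε² are dropped; the full construction uses a real ordered field):

```python
from dataclasses import dataclass
from fractions import Fraction

# Extended-probability values a + b*eps, eps a positive infinitesimal,
# truncated at first order in eps.
@dataclass(frozen=True)
class Ext:
    a: Fraction                 # standard (real) part
    b: Fraction = Fraction(0)   # infinitesimal part

    def __add__(self, o):
        return Ext(self.a + o.a, self.b + o.b)

    def __mul__(self, o):       # eps^2 terms are dropped in this truncation
        return Ext(self.a * o.a, self.a * o.b + self.b * o.a)

    def __lt__(self, o):        # lexicographic: eps is below every positive real
        return (self.a, self.b) < (o.a, o.b)

half = Fraction(1, 2)
x = Ext(half)                   # x -> 0.5
y = Ext(half, Fraction(1))      # y -> 0.5 + eps
assert x < y                    # y is infinitesimally more plausible than x
assert y < Ext(half + Fraction(1, 10**9))   # yet below any real increment
print("0.5 < 0.5 + eps < 0.5 + 1e-9")
```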

Slide 26: Theorems, infinite models
- Every acceptable ordered plausibility model is equivalent to extended probability.
- Every ordered closed plausibility model that can be embedded in the reals is equivalent to standard probability.

Slide 27: SUMMARY
- With the assumptions of refinability, independence and strict monotonicity, finite ordered plausibility models are equivalent to probabilities.
- With the further assumption of closability, infinite ordered plausibility models are equivalent to extended probabilities.
- And real closed plausibility models are equivalent to probabilities.

Slide 28: OPEN PROBLEMS
- Is it possible to obtain both coherence and consistency, asymptotically, for infinite-dimensional systems? (Robins, Wasserman)
- Is a partially ordered domain of plausibilities equivalent to Bayesian multiple contexts?
- What is the relation between random set theory and Dempster-Shafer theory?
