Dutch books and epistemic events
Jan-Willem Romeijn, Psychological Methods, University of Amsterdam
ILLC 2005: Interfacing Probabilistic and Epistemic Update

Outline
- Updating by conditioning
- Violations of conditioning
- External shocks to the probability
- Meaning shifts in epistemic updates
- A Bayesian model of epistemic updates
- No-representation theorem
- Concluding remarks

Updating by conditioning
If probability theory is seen as a logic, updating functions like a deductive inference rule. Updating by conditioning is a consistency constraint for incorporating new facts in a probability assignment:

premise: probability assignment p
events: A, B, C, ...
probabilistic conclusions: p(· | ABC...)
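As a minimal sketch (the world names and numbers are illustrative, not from the slides), conditioning on A just restricts the assignment to the A-worlds and renormalises:

```python
# Updating by conditioning: restrict p to the worlds where the learned
# event holds, then renormalise. Worlds are labelled by which of A, B hold.
def condition(p, event):
    """Return p(. | event) as a new assignment over the worlds in event."""
    total = sum(p[w] for w in event)
    return {w: p[w] / total for w in p if w in event}

# four possible worlds with a uniform prior assignment p
p = {"AB": 0.25, "Ab": 0.25, "aB": 0.25, "ab": 0.25}
A = {"AB", "Ab"}                 # the worlds where A holds
posterior = condition(p, A)      # p(. | A)
print(posterior)                 # {'AB': 0.5, 'Ab': 0.5}
```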

Muddy Venn diagrams
Conditioning on the fact that A is like zooming in on the probability assignment p within the set of possible worlds A: p is replaced by p(· | A). Probability is represented by the size of the rectangles in the diagram. Apart from normalising the probability of A, no changes are induced by the update operation.

Violating conditioning
Bayesian conditioning is violated if, in the course of the update, we also change the probabilities within A: the updated probability is p_A(B) < p(B | A). This difference makes the agent vulnerable to a Dutch book.
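The vulnerability can be made concrete with a standard diachronic Dutch book sketch. The prices and the agent's deviant posterior below are illustrative assumptions, not the slides' numbers: an agent with p(A) = 1/2 and p(B | A) = 1/2 who announces p_A(B) = 1/4 after learning A loses money however the world turns out.

```python
# Sketch of a diachronic Dutch book against a non-conditionalising agent.
p_A, p_B_given_A, q = 0.5, 0.5, 0.25     # q = the agent's deviant p_A(B)
x = p_B_given_A - q                      # 0.25, the announced drop

def agent_net(world):
    """Agent's total cashflow for world in {'A&B', 'A&~B', '~A'}."""
    net = 0.0
    net -= p_B_given_A                   # buys a bet on B, called off if not A
    net += x * (1 - p_A)                 # sells the bookie a bet paying x if ~A
    if world == "~A":
        net += p_B_given_A               # conditional bet refunded
        net -= x                         # pays out the ~A bet
    else:
        net += q                         # sells the B-bet back at the new price q
    return net

losses = {w: agent_net(w) for w in ("A&B", "A&~B", "~A")}
# the agent loses 0.125 in every possible world: a sure loss
```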

Rational violations?
In particular cases, violations of conditioning may seem rational:
- violations of the likelihood principle in classical statistics, model selection problems;
- epistemic updates: incorporating facts about knowledge states.
Can we make sense of such violations from within a Bayesian perspective?

Possible resolution
Violations are understandable if they result from changes in meaning. On learning A we may reinterpret B as B'. Can we represent such a meaning shift as a Bayesian update, saying that we actually learned A'? That is, can we find an A' with p(B' | A) = p(B | A')?

Probability shocks
Violations of conditioning can be understood as an external shock to the probability assignment p. The events are associated with the same possible worlds, but these worlds are assigned new probabilities p' according to a new constraint: in the diagram, a world with p = 1/4 is shifted to p' = 3/8, and another to p' = 1/8.

Restricting the shock
External shocks to the probability assignment may be governed by further formal criteria, such as minimal distance between p and p'. Such criteria may be conservative, but they are not consistent.
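One such minimal-distance criterion can be sketched as follows, under two illustrative assumptions: Kullback-Leibler divergence as the distance, and a constraint that fixes the new probability of A. Minimising D(p' || p) under p'(A) = c rescales the probabilities inside and outside A, so with c = 1 the update coincides with conditioning, while c < 1 produces a "shock" like the one on the slide (1/4 shifting to 3/8 and 1/8):

```python
# Minimal-KL update under the constraint p'(A) = c: the minimiser
# rescales p uniformly within A and within its complement.
def min_kl_update(p, A, c):
    mass_A = sum(q for w, q in p.items() if w in A)
    return {w: q * c / mass_A if w in A else q * (1 - c) / (1 - mass_A)
            for w, q in p.items()}

p = {"AB": 0.25, "Ab": 0.25, "aB": 0.25, "ab": 0.25}
shocked = min_kl_update(p, {"AB", "Ab"}, 0.75)   # external shock: p'(A) = 3/4
# each A-world moves from 1/4 to 3/8, each other world to 1/8
```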

Choosing premises
From a logical point of view, the update procedure comes down to choosing new premises. This is the extra-logical domain of objective Bayesianism: formally constrained prior probabilities.

premise p, events A, B, C, ...  →  conclusion p(· | ABC...)
premise p', events A, B, C, ...  →  conclusion p'(· | ABC...)

Meaning shifts
The update operation can also be seen as a change to the semantics: p(B' | A) < p(B | A). The probabilities of possible worlds remain the same, but the update induces an implicit change of the facts involved.

Epistemic updates
Consider two research groups, 1 and 2, that try to discover which of A, B, or C holds, each world-state having probability p = 1/3. The groups use different methods, delivering doubt or certainty in differing sets of possible worlds: group 1 is in doubt (D1) between A and B, and group 2 is in doubt (D2) between B and C.

Conditional probability
According to the standard definition of conditional probability, we have p(D2 | D1) = p(D1 ∧ D2) / p(D1) = (1/3) / (2/3) = 1/2. But is this also the appropriate updated probability?
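The slide's number can be checked directly, taking D1 = {A, B} and D2 = {B, C} as the doubt sets:

```python
# Three world-states, equally likely; group 1 doubts on D1, group 2 on D2.
p = {"A": 1/3, "B": 1/3, "C": 1/3}
D1, D2 = {"A", "B"}, {"B", "C"}

# standard conditioning: p(D2 | D1) = p(D1 and D2) / p(D1) = (1/3)/(2/3)
p_D2_given_D1 = sum(p[w] for w in D1 & D2) / sum(p[w] for w in D1)
print(p_D2_given_D1)   # 0.5
```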

Updated probability
It seems that after an update with D1, the second research group has very little to doubt about. Updating induces a meaning shift D2 → D'2, and the correct updated probability is p(D'2 | D1) = 0.
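Why p(D'2 | D1) = 0 can be sketched by modelling each group's method as a partition of the state space. The partitions below are my reconstruction of the diagrams (group 1 distinguishes C from {A, B}; group 2 distinguishes A from {B, C}); a group is "in doubt" at a world when its partition cell there still contains more than one live world:

```python
# Doubt events recomputed after the state space shrinks.
def doubt_event(worlds, partition):
    """Worlds in which the method leaves more than one possibility open."""
    return {w for w in worlds
            for cell in partition if w in cell and len(cell & worlds) > 1}

worlds = {"A", "B", "C"}
group1 = [{"A", "B"}, {"C"}]          # group 1's method
group2 = [{"A"}, {"B", "C"}]          # group 2's method

D1 = doubt_event(worlds, group1)       # {'A', 'B'}
remaining = worlds & D1                # learning D1 rules out C
D2_shifted = doubt_event(remaining, group2)   # the shifted event D'2
# D'2 is empty: with C gone, group 2's cells are singletons, so p(D'2 | D1) = 0
```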

Epistemic events
The meaning shift D2 → D'2 can be understood by including epistemic states into the semantics. The diagram shows the epistemic states of groups 1 and 2 that are accessible in the world-state B.

External states
After learning that D1, we may exclude world-state C from the state space W.

Epistemic update
But a full update also comprises conditioning on the accessible epistemic states of both research groups. This latter step brings about the event change D2 → D'2.

Bayesian conditioning
There is no violation of conditioning in the example. It is simply unclear which event we are supposed to update with upon learning that group 1 is in doubt: D1 or D'1.

Choosing semantics
Many puzzles on the applicability of Bayesian updating can be dealt with by making explicit the exact events we update upon. We must choose the semantics so as to include all these events. Is that always possible?

Judy Benjamin updates
In updating a probability p to p_θ by distance minimisation under a partition of constraints θ, we may have p_θ(B) > p(B) for some B and all θ. Now suppose that we can associate the constraints θ with a partition of events G_θ, so that p_θ(·) = p(· | G_θ).

No-representation theorem
In Bayesian conditioning on events A_θ from a partition, the prior is always a convex combination of the posteriors:

p(B) = Σ_θ p(A_θ) p(B | A_θ).

But because p(B | G_θ) > p(B) for all but one θ, we have Σ_θ p(G_θ) p(B | G_θ) > p(B), contradicting this identity. It thus seems that there is no set of events G_θ that can mimic distance minimisation on the constraints θ.
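The convexity identity is easy to verify numerically (the prior and partition below are illustrative); any rule that raises the probability of B under every element of a partition therefore cannot be conditioning:

```python
# Check: for a partition {A_t}, p(B) = sum_t p(A_t) * p(B | A_t),
# so the posteriors p(B | A_t) cannot all exceed the prior p(B).
import random
random.seed(0)

worlds = range(6)
weights = [random.random() for _ in worlds]
p = {w: x / sum(weights) for w, x in zip(worlds, weights)}

B = {0, 1, 2}
partition = [{0, 3}, {1, 4}, {2, 5}]

prior_B = sum(p[w] for w in B)
mixture = sum(sum(p[w] for w in A)                           # p(A_t)
              * sum(p[w] for w in A & B) / sum(p[w] for w in A)  # p(B|A_t)
              for A in partition)
# mixture equals prior_B up to rounding error
```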

In closing
Some considerations for further research:
- There is a large gap between the epistemic puzzles and cases like model selection. It is unclear what kind of event is behind violations of the likelihood principle, as in the stopping rule.
- Probabilistic consistency may not be the only virtue if we object to a principled distinction between epistemology and logic.