James Hawthorne
Workshop: Conditionals, Counterfactuals and Causes in Uncertain Environments
Düsseldorf, 19/5/2011 – 22/5/2011

Why Measure Evidential Support Probabilistically?

That is: why measure the support of H by E on a scale of real numbers between 0 and 1 that satisfies typical probabilistic axioms?

More specifically, I advocate a version of Bayesian Confirmation Theory where the confirmation functions are so-called Popper Functions, and I think that evidential support is often best represented by sets of Popper Functions, because individual probabilistic support functions seem to be overly precise measures of evidential support.

But isn't the fact that probabilistic measures of support are overly precise just a symptom of the fact that probability functions are really ill-suited as measures of confirmation? More generally, why think that a probabilistic measure is at all the right sort of thing to use in measuring evidential support? The qualitative logic of evidential support I'll present offers an answer to these questions.

===============================================================================

More generally, if the Popper Functions provide a useful version of the notion of conditional probability used in a given project X, then this qualitative logic may provide a useful theoretical account of what the probability functions in project X really represent.

Several Kinds of Answers:

1. Show what results follow: that the resulting probabilistic system supplies an account of confirmation that has intuitively desirable properties.

2. Argue from fundamental principles:
(i) argue that the probabilistic axioms themselves are reasonable constraints on a measure of evidential support (e.g. analyses of axioms; Dutch Book arguments; etc.)
OR
(ii) argue that the axiomatic system captures some deeper, more fundamental logic of evidential support that is itself a compelling account (e.g. Representation Theorem arguments).

Conditional Probability Functions
(equivalent to those proposed by Karl Popper, Logic of Scientific Discovery, 1959)

0. for some E, F, Pα[F | E] ≠ 1
1. 0 ≤ Pα[A | B] ≤ 1
2. if B |= A, then Pα[A | B] = 1
3. if C |= B and B |= C, then Pα[A | B] = Pα[A | C]
4. Pα[(A·B) | C] = Pα[A | (B·C)] × Pα[B | C]
5. if C |= ~(A·B), then Pα[(A∨B) | C] = Pα[A | C] + Pα[B | C], or Pα[D | C] = 1 for all D
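
To make the axioms concrete, here is a minimal Python sketch (my own illustration, not part of the slides): it builds a conditional probability function from a strictly positive probability over the truth assignments of a three-letter sentential language, stipulates Pα[A | B] = 1 whenever B has probability zero, and spot-checks the product rule (axiom 4) and additivity (axiom 5). All names (WORLDS, PRIOR, cond) are assumptions made for the example.

```python
from itertools import product

# Worlds: all truth assignments to three sentence letters.
ATOMS = ["A", "B", "C"]
WORLDS = [dict(zip(ATOMS, vals)) for vals in product([True, False], repeat=3)]
PRIOR = {i: 1.0 / len(WORLDS) for i in range(len(WORLDS))}   # uniform, strictly positive

def prob(sentence):
    """Unconditional probability of a sentence, given as a predicate on worlds."""
    return sum(PRIOR[i] for i, w in enumerate(WORLDS) if sentence(w))

def cond(a, b):
    """Pα[a | b]; set to 1 when b has probability 0, one way to obtain a Popper-style function."""
    pb = prob(b)
    return 1.0 if pb == 0 else prob(lambda w: a(w) and b(w)) / pb

A = lambda w: w["A"]
B = lambda w: w["B"]
C = lambda w: w["C"]

# Axiom 4 (product rule): Pα[(A·B) | C] = Pα[A | (B·C)] × Pα[B | C]
assert abs(cond(lambda w: A(w) and B(w), C)
           - cond(A, lambda w: B(w) and C(w)) * cond(B, C)) < 1e-12

# Axiom 5 (additivity): D and E below are incompatible, so C |= ~(D·E)
D = lambda w: A(w) and not B(w)
E = lambda w: B(w) and not A(w)
assert abs(cond(lambda w: D(w) or E(w), C) - (cond(D, C) + cond(E, C))) < 1e-12

# Axioms 1 and 2 on the same toy examples
assert 0 <= cond(A, B) <= 1 and cond(lambda w: A(w) or not A(w), B) == 1
```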

Popper Probability Functions (only for Sentential Logic Languages)
(Karl Popper, Logic of Scientific Discovery, 1959)

0. each Pα[A | B] is a real number, and for some E, F, G, H, Pα[F | E] ≠ Pα[G | H]
1. Pα[A | A] ≥ Pα[B | B]
2. Pα[A | (B·C)] ≥ Pα[A | (C·B)]
3. Pα[A | C] ≥ Pα[(A·B) | C]
4. Pα[(A·B) | C] = Pα[A | (B·C)] × Pα[B | C]
5. Pα[A | B] + Pα[~A | B] = Pα[B | B], unless Pα[D | B] = Pα[B | B] for all D

Popper-Field Probability Functions (Extends Popper Functions to Predicate Logic Languages)
(Hartry Field, “Logic, Meaning, and Conceptual Role”, JP, 7/1977)

Define a PF-Class M to be a set of functions Pα such that:

0. Pα is a Popper Function (i.e. satisfies the previous rules)
1. Pα[((Fc1·Fc2)· ... ·Fcm) | B] ≥ Pα[∀xFx | B]
2. if r > Pα[∀xFx | B] (for r a positive real), then for some Pβ in M defined on a name extension of Pα's language L, there are names e1, ..., en in Pβ's language such that r ≥ Pβ[((Fe1·Fe2)· ... ·Fen) | B]
(i.e. Pα[∀xFx | B] is a greatest lower bound on values of Pβ[((Fe1·Fe2)· ... ·Fen) | B] in PF-Class M)

The Popper-Field Functions on a language and its name extensions are just the union of all PF-Classes M on that language and its name extensions.

Several Kinds of Answers:

1. Show what results follow: that the resulting probabilistic system supplies an account of confirmation that has intuitively desirable properties.

2. Argue from fundamental principles:
(i) argue that the probabilistic axioms themselves are reasonable constraints on a measure of evidential support (e.g. analyses of axioms; Dutch Book arguments; etc.)
OR
(ii) argue that the axiomatic system captures some deeper, more fundamental logic of evidential support that is itself a compelling account (e.g. Representation Theorem arguments).

Want to characterize comparative support relations ≽α of the form:

H1 | E1 ≽α H2 | E2 :  conclusion H1 is supported by premise(s) E1 at least as strongly as conclusion H2 is supported by premise(s) E2 (under interpretation α)

Think of this notion of comparative support strength under an interpretation of a language (for predicate logic) as a basic semantic notion – i.e. as basic in the same way that the notion of truth under an interpretation of a language (for predicate logic) is a basic semantic notion.

What semantic rules should all comparative support relations ≽α obey? That is, what semantic rules should constrain how logical terms behave within a comparative support relation ≽α? (in the way that semantic rules constrain how logical terms behave w.r.t. the notion of truth under interpretation)

The semantic rules that govern ≽α should be intuitively plausible constraints on comparisons of evidential support strength.

Each such relation ≽α should be (at least) a partial order – perhaps not every pair of cases of support strength may be directly comparable.

The semantic rules that govern ≽α should (if possible) not presuppose the notion of logical entailment – rather, logical entailment should fall out of the relations ≽α as a special case.

It should turn out that quantitative conditional probability functions (i.e. Popper Functions) merely provide a convenient way to represent the comparative support relations.

Comparative Support Relation under an interpretation α:

H1|E1 ≽α H2|E2 :  H1 is supported by E1 at least as strongly as H2 is supported by E2

Given a comparative support-strength relation ≽α, define four associated relations:

(1) H1|E1 ≻α H2|E2 abbreviates H1|E1 ≽α H2|E2 and not H2|E2 ≽α H1|E1:
H1 is supported by E1 more strongly than H2 is supported by E2

(2) H1|E1 ≈α H2|E2 abbreviates H1|E1 ≽α H2|E2 and H2|E2 ≽α H1|E1:
H1 is supported by E1 to the same extent that H2 is supported by E2

(3) H1|E1 ≺≻α H2|E2 abbreviates not H1|E1 ≽α H2|E2 and not H2|E2 ≽α H1|E1:
the support-strength for H1 by E1 is not determinately comparable to that of H2 by E2

(4) E ⊨α H abbreviates H|E ≽α E|E:
E supportively entails H

It will turn out that:
1. each supportive entailment relation (i.e. for each ≽α) is a so-called Rational Consequence Relation
2. indeed, each Rational Consequence Relation is represented by a supportive entailment relation (for some ≽α)
3. Logical Entailment is just supportive entailment for every ≽α

It should turn out that quantitative conditional probability functions (i.e. Popper Functions) are merely a convenient way to represent the comparative support relations. (That's the main point of this project.) That is, ...

The rules that constrain the comparative support relations should be probabilistically sound in that each Popper Function should satisfy them. For each Popper Function Pα, define the corresponding comparative relation ≽α such that, for all sentences H1, E1, H2, E2:

H1 | E1 ≽α H2 | E2  iff  Pα[H1 | E1] ≥ Pα[H2 | E2].

Then, for each Popper Function Pα, we want its corresponding comparative relation ≽α to be a comparative support relation – i.e. to satisfy the rules for the comparative support relations.

And we want the rules that constrain the comparative support relations to be probabilistically complete in the sense that each comparative support relation ≽α that satisfies the specified rules should be representable by a Popper Function Pα.
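
A small illustrative sketch (mine, not the slides') of this correspondence: induce ≽α from an arbitrary conditional probability function and read off the four associated relations defined earlier. The parameter cond stands for some Popper-style function, such as the toy one sketched above; all function names are assumptions.

```python
def geq(h1, e1, h2, e2, cond):
    """H1|E1 is supported at least as strongly as H2|E2, read off from Pα."""
    return cond(h1, e1) >= cond(h2, e2)

def strictly(h1, e1, h2, e2, cond):          # ≻α
    return geq(h1, e1, h2, e2, cond) and not geq(h2, e2, h1, e1, cond)

def equally(h1, e1, h2, e2, cond):           # ≈α
    return geq(h1, e1, h2, e2, cond) and geq(h2, e2, h1, e1, cond)

def incomparable(h1, e1, h2, e2, cond):      # ≺≻α: never holds for a probability-induced
    return (not geq(h1, e1, h2, e2, cond)    # relation, since >= on the reals is complete;
            and not geq(h2, e2, h1, e1, cond))   # this is the "overly precise" point above

def supportively_entails(e, h, cond):        # E ⊨α H  iff  H|E ≽α E|E
    return geq(h, e, e, e, cond)
```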

The rules that constrain the comparative support relations should be probabilistically sound and complete in that each Popper Function should correspond to a comparative support relation, and each comparative support relation ≽α (that satisfies the rules) should be representable by a Popper Function Pα.

=================================================================

The strongest version of probabilistic representation for a comparative support relation would be this:

Strong Probabilistic Representation:  For each comparative support relation ≽α (that satisfies the specified rules) there is a (unique) Popper function Pα such that, for all H1, E1, H2, E2,
Pα[H1 | E1] ≥ Pα[H2 | E2] if and only if H1 | E1 ≽α H2 | E2.

Such a representation result will be forthcoming, but only for those comparative support relations ≽α that provide a complete order on comparative support-strength – i.e., only for those ≽α such that for all H1, E1, H2, E2,
either H1 | E1 ≽α H2 | E2 or H2 | E2 ≽α H1 | E1 (complete comparability).

However, that is a very strong constraint on comparative support strength. I'll characterize comparative support relations that need not satisfy this condition.

Strong Probabilistic Representation will also require an Archimedean Condition – more on that next.

The rules that constrain the comparative support relations should be probabilistically sound and complete in that each Popper Function should correspond to a comparative support relation, and each comparative support relation ≽α (that satisfies the rules) should be representable by a Popper Function Pα.

==========================================================================

A weaker notion of probabilistic representation for comparative support relations would be this:

Moderate Probabilistic Representation (but not permitting infinitesimally greater support):  For each comparative support relation ≽α there is a Popper Function Pα such that for all H1, E1, H2, E2,
(1) if H1 | E1 ≻α H2 | E2, then Pα[H1 | E1] > Pα[H2 | E2];
(2) if H1 | E1 ≈α H2 | E2, then Pα[H1 | E1] = Pα[H2 | E2].

(1) and (2) are jointly equivalent to the following conditions:
if Pα[H1 | E1] > Pα[H2 | E2], then H1 | E1 ≻α H2 | E2 or H1 | E1 ≺≻α H2 | E2;
if Pα[H1 | E1] = Pα[H2 | E2], then H1 | E1 ≈α H2 | E2 or H1 | E1 ≺≻α H2 | E2.

The fact that comparative support strength is a partial order implies that probabilistic representations of evidential support tend to be overly precise. Thus, evidential support is often represented by a set of conditional probability functions rather than by a single conditional probability function.

In order to satisfy condition (1) the comparative support relations have to satisfy an Archimedean Condition – e.g.:
If H1 | E1 ≻α H2 | E2, then for an integer n ≥ 2 there are n sentences S1, ..., Sn such that: not E2 ⊨α ~E2, E2 ⊨α (S1 ∨ ... ∨ Sn), (for distinct i, j) E2 ⊨α ~(Si·Sj), Si|E2 ≈α Sj|E2, and H1 | E1 ≻α (Si∨H2)|E2.

The rules that constrain the comparative support relations should be probabilistically sound and complete in that each Popper Function should correspond to a comparative support relation, and each comparative support relation ≽α (that satisfies the rules) should be representable by a Popper Function Pα.

==========================================================================

An even weaker notion of probabilistic representation for comparative support relations will apply more generally to all such relations – including those that provide only Non-Archimedean partial orders on support strength.

Weak Probabilistic Representation (permitting infinitesimally greater support):  For each comparative support relation ≽α there is a Popper Function Pα such that for all H1, E1, H2, E2,
(1) if H1 | E1 ≻α H2 | E2, then Pα[H1 | E1] ≥ Pα[H2 | E2];
(2) if H1 | E1 ≈α H2 | E2, then Pα[H1 | E1] = Pα[H2 | E2].

(1) and (2) are jointly equivalent to the following condition:
if Pα[H1 | E1] > Pα[H2 | E2], then H1 | E1 ≻α H2 | E2 or H1 | E1 ≺≻α H2 | E2.

Weak Probabilistic Representation still requires an axiom about partitions, but one that's much weaker than the previous Archimedean Condition – e.g. an Arbitrarily Large Partitions Condition:
For each integer m ≥ 2 there is an integer n ≥ m such that for n sentences S1, ..., Sn and some sentence E: not E ⊨α ~S1, (for distinct i, j) E ⊨α ~(Si·Sj), Si|E ≈α Sj|E.

0. for some H1, E1, H2, E2, H1|E1 ≻α H2|E2 (non-triviality)
1. B|B ≽α H|E (maximality)
2. If H1|E1 ≽α H2|E2 and H2|E2 ≽α H3|E3, then H1|E1 ≽α H3|E3 (transitivity)
3. If E1 |= E2 and E2 |= E1, then H|E1 ≽α H|E2 (Antecedent L-Equivalence)!
4. If H2 |= H1, then H1|E ≽α H2|E (Consequent L-Consequence)!
{rule 4 yields Reflexivity: H|E ≽α H|E}
5. If H1|E1 ≽α H2|E2, then ~H2|E2 ≽α ~H1|E1 or E1 ⊨α D (all D) (negation-symmetry)!
6. If H1|(E1·F) ≽α H2|E2 and H1|(E1·~F) ≽α H2|E2, then H1|E1 ≽α H2|E2 (alternate presumption)*
7. If E ⊨α H, then E ⊨α ~F or (E·F) ⊨α H (Rational Monotonicity)

0. for some H1, E1, H2, E2, H1|E1 ≻α H2|E2 (non-triviality)
1. B|B ≽α H|E (maximality)
2. If H1|E1 ≽α H2|E2 and H2|E2 ≽α H3|E3, then H1|E1 ≽α H3|E3 (transitivity)
3. H|(E1·E2) ≽α H|(E2·E1) (Antecedent Commutativity)!
4.1 (H·H)|E ≽α H|E (Consequent Repetition: CR)!
4.2 H1|E1 ≽α (H1·H2)|E1 (Simplification)!
{jointly yield Reflexivity: H|E ≽α H|E}
5.1 If H1|E1 ≽α ~H2|E2, then H2|E2 ≽α ~H1|E1 or E1 ⊨α D (all D)
5.2 If ~H1|E1 ≽α H2|E2, then ~H2|E2 ≽α H1|E1 or E1 ⊨α D (all D) (negation-symmetry)!
6. If H1|(E1·F) ≽α H2|E2 and H1|(E1·~F) ≽α H2|E2, then H1|E1 ≽α H2|E2 (alternate presumption)*
7. If E ⊨α H, then E ⊨α ~F or (E·F) ⊨α H (Rational Monotonicity)
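
As an informal soundness spot-check (a sketch under my own assumptions, not the slides' proof), the probability-induced relation of the earlier sketch can be tested against a few of these rules by brute force over all truth-functional sentences of a two-letter language. The prior, the world representation, and the helper names are illustrative.

```python
from itertools import product

WORLDS = [dict(A=a, B=b) for a, b in product([True, False], repeat=2)]
PRIOR = [0.4, 0.3, 0.2, 0.1]                       # strictly positive, sums to 1

def p(s):
    return sum(PRIOR[i] for i, w in enumerate(WORLDS) if s(w))

def cond(h, e):
    pe = p(e)
    return 1.0 if pe == 0 else p(lambda w: h(w) and e(w)) / pe

def geq(h1, e1, h2, e2):                           # H1|E1 ≽α H2|E2, probability-induced
    return cond(h1, e1) >= cond(h2, e2)

# All 16 truth-functions of the two sentence letters, as predicates on worlds.
SENTS = []
for bits in product([False, True], repeat=4):
    table = dict(zip(product([True, False], repeat=2), bits))
    SENTS.append(lambda w, t=table: t[(w["A"], w["B"])])

AND = lambda x, y: (lambda w: x(w) and y(w))

# Rule 1 (maximality): B|B ≽α H|E.
assert all(geq(b, b, h, e) for b in SENTS for h in SENTS for e in SENTS)

# Rule 3 (Antecedent Commutativity): H|(E1·E2) ≽α H|(E2·E1).
assert all(geq(h, AND(e1, e2), h, AND(e2, e1))
           for h in SENTS for e1 in SENTS for e2 in SENTS)

# Rule 4.2 (Simplification): H1|E1 ≽α (H1·H2)|E1.
assert all(geq(h1, e1, AND(h1, h2), e1)
           for h1 in SENTS for h2 in SENTS for e1 in SENTS)
```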

8. If H1|(A1·E1) ≽α H2|(A2·E2) and A1|E1 ≽α A2|E2, then (H1·A1)|E1 ≽α (H2·A2)|E2 (composition)
8.1 If H1|(A1·E1) ≻α H2|(A2·E2), A1|E1 ≽α A2|E2, and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≻α (H2·A2)|E2
8.2 If H1|(A1·E1) ≽α H2|(A2·E2), A1|E1 ≻α A2|E2, and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≻α (H2·A2)|E2
8.3 If H1|(A1·E1) ≺≻α H2|(A2·E2) and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≺≻α (H2·A2)|E2
8.4 If A1|E1 ≺≻α A2|E2 and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≺≻α (H2·A2)|E2
========================================================================
If (H2·A2)|E2 ≽α (H1·A1)|E1, A1|E1 ≽α A2|E2, and E2 ⊭α ~(H2·A2), then H2|(A2·E2) ≽α H1|(A1·E1)
If (H2·A2)|E2 ≽α (H1·A1)|E1, H1|(A1·E1) ≽α H2|(A2·E2), and E2 ⊭α ~(H2·A2), then A2|E2 ≽α A1|E1
(decomposition)
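
Why composition (rule 8) is probabilistically sound can be seen from the product rule Pα[H·A | E] = Pα[H | A·E] × Pα[A | E]: two comparisons between nonnegative factors multiply to a comparison between the products. A small randomized illustration with toy numbers (mine, not the slides'):

```python
import random

random.seed(0)
for _ in range(1000):
    h2, a2 = random.random(), random.random()   # Pα[H2 | A2·E2] and Pα[A2 | E2]
    h1 = random.uniform(h2, 1.0)                # Pα[H1 | A1·E1] ≥ Pα[H2 | A2·E2]
    a1 = random.uniform(a2, 1.0)                # Pα[A1 | E1]   ≥ Pα[A2 | E2]
    # product rule: Pα[H1·A1 | E1] = h1*a1  and  Pα[H2·A2 | E2] = h2*a2
    assert h1 * a1 >= h2 * a2                   # composition (rule 8) in probabilistic form
```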

9. If H1|(A1·E1) ≽α A2|E2 and A1|E1 ≽α H2|(A2·E2), then (H1·A1)|E1 ≽α (H2·A2)|E2 (composition)
9.1 If H1|(A1·E1) ≻α A2|E2, A1|E1 ≽α H2|(A2·E2), and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≻α (H2·A2)|E2
9.2 If H1|(A1·E1) ≽α A2|E2, A1|E1 ≻α H2|(A2·E2), and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≻α (H2·A2)|E2
9.3 If H1|(A1·E1) ≺≻α A2|E2 and E2 ⊭α ~(H2·A2), then (H1·A1)|E1 ≺≻α (H2·A2)|E2
==========================================================================
If (H2·A2)|E2 ≽α (H1·A1)|E1, A1|E1 ≽α H2|(A2·E2), and E2 ⊭α ~(H2·A2), then A2|E2 ≽α H1|(A1·E1)
If (H2·A2)|E2 ≽α (H1·A1)|E1, H1|(A1·E1) ≽α A2|E2, and E2 ⊭α ~(H2·A2), then H2|(A2·E2) ≽α A1|E1
(decomposition)

Bayes’ Theorem 1: Suppose B ⊭α ~H1.
If E|(H1·B) ≻α E|(H2·B) and H1|B ≽α H2|B, then H1|(E·B) ≻α H2|(E·B).

Bayes’ Theorem 2: Suppose B ⊭α ~H1, B ⊭α ~H2, and B ⊨α ~(H1·H2).
If E|(H1·B) ≻α E|(H2·B), then H1|(E·B·(H1∨H2)) ≻α H1|(B·(H1∨H2)) and H2|(B·(H1∨H2)) ≻α H2|(E·B·(H1∨H2)).

==============================================================

Compare: Suppose Pα[H1 | B] > 0, Pα[H2 | B] > 0, Pα[H1·H2 | B] = 0. If Pα[E | H1·B] > Pα[E | H2·B], then

Pα[H1 | E·B·(H1∨H2)] / Pα[H2 | E·B·(H1∨H2)]
= Pα[H1 | E·B] / Pα[H2 | E·B]
= (Pα[E | H1·B] × Pα[H1 | B]) / (Pα[E | H2·B] × Pα[H2 | B])
> Pα[H1 | B] / Pα[H2 | B]
= Pα[H1 | B·(H1∨H2)] / Pα[H2 | B·(H1∨H2)]
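
A tiny numerical illustration of the quantitative comparison (toy numbers of my own; it assumes the common factor Pα[E | B] cancels when forming the odds):

```python
def posterior_odds(prior1, prior2, like1, like2):
    """Odds Pα[H1 | E·B] / Pα[H2 | E·B], computed from priors Pα[Hi | B]
    and likelihoods Pα[E | Hi·B]; the common normalizing factor cancels."""
    return (like1 * prior1) / (like2 * prior2)

# H1|B is supported at least as strongly as H2|B, and E favors H1 over H2 ...
prior_h1, prior_h2 = 0.3, 0.3
like_h1, like_h2 = 0.8, 0.2
# ... so H1|(E·B) is supported more strongly than H2|(E·B), as in Bayes' Theorem 1.
assert posterior_odds(prior_h1, prior_h2, like_h1, like_h2) > 1
```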

Extendibility Rule(s): ≽α is extendable to a comparative support relation ≽β (i.e. whenever A|B ≻α C|D, A|B ≻β C|D; whenever A|B ≈α C|D, A|B ≈β C|D) such that ≽β satisfies all the other rules (1-9 thus far), and also satisfies the following rule:

10. H1|E1 ≽β H2|E2 or H2|E2 ≽β H1|E1 (complete comparability)

=========================================================

Example: a relation ≽α that satisfies rules 1-9 but is not extendable to a relation ≽β satisfying Complete Comparability can have H|(B·E) ≻α H|B and H|(B·~E) ≻α H|B {but only if E|B ≺≻α E|(H·B)}.

==========================================================

Theorem (from 1-5, 7-9): H|B ≺≻α H|(B·E) or H|B ≺≻α H|(B·~E) or E|B ≺≻α E|(H·B) or H|(B·E) ≽α H|B ≽α H|(B·~E) or H|(B·~E) ≽α H|B ≽α H|(B·E).

Extendibility: ≽α is extendable to a comparative support relation ≽β (i.e. whenever A|B ≻α C|D, A|B ≻β C|D; whenever A|B ≈α C|D, A|B ≈β C|D) that satisfies all the other rules, and also satisfies the following rules:

10. H1|E1 ≽β H2|E2 or H2|E2 ≽β H1|E1 (complete comparability)

{it follows that} For any integer n > 1, if A1, ..., An and B1, ..., Bn are such that
C ⊨β (A1 ∨ ... ∨ An), C ⊨β ~(Ai·Aj), An|C ≽β ... ≽β A1|C, not C ⊨β ~C,
D ⊨β (B1 ∨ ... ∨ Bn), D ⊨β ~(Bi·Bj), Bn|D ≽β ... ≽β B1|D, not D ⊨β ~D,
then An|C ≽β B1|D. (subdivision)*

11. For each integer m ≥ 2 there is an integer n ≥ m such that for n sentences S1, ..., Sn and some sentence E: not E ⊨β ~S1, (for distinct i, j) E ⊨β ~(Si·Sj), Si|E ≈β Sj|E. (arbitrarily large partitions)

or, alternatively,

11+. If H1|E1 ≻β H2|E2, then for some integer n ≥ 2 there are n sentences S1, ..., Sn such that: not E2 ⊨β ~S1, (for distinct i, j) E2 ⊨β ~(Si·Sj), Si|E2 ≈β Sj|E2, E2 ⊨β (S1 ∨ ... ∨ Sn), and H1|E1 ≻β (Si∨H2)|E2.

The Representation Theorems hold for all comparative support relations that satisfy these rules. Strong Representation holds for each ≽α satisfying rules 0-10 and 11+. Moderate Representation holds for each ≽α satisfying rules 0-9 and extendable to rules 10 and 11+. Weak Representation holds for each ≽α satisfying rules 0-9 and extendable to rules 10-11.

The version of the rules that does not presuppose the deductive logical entailment relation also needs explicit rules for quantifiers. The following rules are analogous to those proposed by Hartry Field for the Popper Functions. (Here ‘CS’ stands for “Comparative Support”.)

Define a CS-Class M to be a set of relations ≽α such that:

0. ≽α satisfies the comparative support rules
1. ((Fc1·Fc2)· ... ·Fcm)|B ≽α ∀xFx|B
2. if A|B ≻α ∀xFx|C, then for some ≽β in M defined on a name extension of ≽α's language L, there are names e1, ..., en in ≽β's language such that A|B ≽β ((Fe1·Fe2)· ... ·Fen)|C
(unless for all n ≥ 2, for any n sentences S1, ..., Sn such that not C ⊨α ~S1, (for distinct i, j) C ⊨α ~(Si·Sj), Si|C ≈α Sj|C, C ⊨α (S1 ∨ ... ∨ Sn), we have (Si∨∀xFx)|C ≻α A|B)
3. if ((Fe1·Fe2)· ... ·Fen)|C ≻β A|B for all ≽β in M defined on a name extension of ≽α's language, and for all names e1, ..., en in ≽β's language, then ∀xFx|C ≽α A|B
(unless A|B ≻α ∀xFx|C, but for all n ≥ 2, for any n sentences S1, ..., Sn such that not C ⊨α ~S1, (for distinct i, j) C ⊨α ~(Si·Sj), Si|C ≈α Sj|C, C ⊨α (S1 ∨ ... ∨ Sn), we have (Si∨∀xFx)|C ≻α A|B)

{both of the last two rules are needed only because the relation ≽α need not be a complete order}

The Comparative Support Relations on a language and its name extensions are just the union of all CS-Classes M (for that language and its name extensions) that are also extendable so as to satisfy rules 10 and 11 (rule 11+ for the Archimedean Comparative Support Relations).

The rules that constrain the comparative support relations are probabilistically sound and complete in that each Popper Function corresponds to a comparative support relation, and each comparative support relation ≽α (that satisfies the rules) is representable by a Popper Function Pα, as follows:

Strong Probabilistic Representation (for Completely Comparable Archimedean CSRs): For each comparative support relation ≽α that satisfies 0-10 and 11+, there is a unique Popper function Pα such that, for all H1, E1, H2, E2, Pα[H1 | E1] ≥ Pα[H2 | E2] if and only if H1 | E1 ≽α H2 | E2.

Moderate Probabilistic Representation (for Archimedean CSRs): For each comparative support relation ≽α that satisfies 0-9 and is extendable to 10 and 11+, there is a Popper Function Pα such that for all H1, E1, H2, E2,
(1) if H1 | E1 ≻α H2 | E2, then Pα[H1 | E1] > Pα[H2 | E2];
(2) if H1 | E1 ≈α H2 | E2, then Pα[H1 | E1] = Pα[H2 | E2].
(1) and (2) are jointly equivalent to the following conditions:
if Pα[H1 | E1] > Pα[H2 | E2], then H1 | E1 ≻α H2 | E2 or H1 | E1 ≺≻α H2 | E2;
if Pα[H1 | E1] = Pα[H2 | E2], then H1 | E1 ≈α H2 | E2 or H1 | E1 ≺≻α H2 | E2.

Weak Probabilistic Representation (for CSRs that may permit infinitesimally greater support): For each comparative support relation ≽α that satisfies 0-9 and is extendable to 10-11, there is a Popper Function Pα such that for all H1, E1, H2, E2,
(1) if H1 | E1 ≻α H2 | E2, then Pα[H1 | E1] ≥ Pα[H2 | E2];
(2) if H1 | E1 ≈α H2 | E2, then Pα[H1 | E1] = Pα[H2 | E2].
(1) and (2) are jointly equivalent to the following condition:
if Pα[H1 | E1] > Pα[H2 | E2], then H1 | E1 ≻α H2 | E2 or H1 | E1 ≺≻α H2 | E2.

When we permit a broad enough class of CSRs such that we merely have Weak Probabilistic Representation (where some CSRs may permit infinitesimally greater support):
For each comparative support relation ≽α that satisfies 0-9 and is extendable to 10-11, there is a Popper Function Pα such that for all H1, E1, H2, E2,
(1) if H1 | E1 ≻α H2 | E2, then Pα[H1 | E1] ≥ Pα[H2 | E2];
(2) if H1 | E1 ≈α H2 | E2, then Pα[H1 | E1] = Pα[H2 | E2].
(1) and (2) are jointly equivalent to the following condition:
if Pα[H1 | E1] > Pα[H2 | E2], then H1 | E1 ≻α H2 | E2 or H1 | E1 ≺≻α H2 | E2.

Then there exist CSRs ≽α such that:
For a countably infinite set of sentences {S1, ..., Sn, ...} and a sentence F: not F ⊨α ~S1 (i.e. S1|F ≻α ~F|F), (for each distinct i, j) F ⊨α ~(Si·Sj), Si|F ≈α Sj|F.

It follows that for this CSR ≽α:
For each integer n ≥ 2, for the n sentences S1, ..., Sn and for a sentence E [i.e. for the sentence (F·(S1 ∨ S2 ∨ ... ∨ Sn))] we have: not E ⊨α ~S1 (i.e. S1|E ≻α ~E|E), (for distinct i, j) E ⊨α ~(Si·Sj), E ⊨α (S1 ∨ S2 ∨ ... ∨ Sn), Si|E ≈α Sj|E.

James Hawthorne
Workshop: Conditionals, Counterfactuals and Causes in Uncertain Environments
Düsseldorf, 19/5/2011 – 22/5/2011

System R = P + (RM)
The Preferential and Rational Consequence Relations
Weaker than Usual Axioms

0. for some E, F, E |/~ F (Nontriv)
1. A |~ A (Reflex)
2. if C |= B, B |= C, B |~ A, then C |~ A (LCE)
3. if C |~ B, B |= A, then C |~ A (RW)
4. if (C·B) |~ A and (C·~B) |~ A, then C |~ A (WOR)
5. if C |~ (B·A), then (C·B) |~ A (VCM)
============================================================
{O: if (C·~B) |~ B, C |~ A, then C |~ (B·A) (WAND)}
P: 6. if C |~ B, C |~ A, then C |~ (B·A) (AND)
============================================================
{Q: if C |~ A and (C·~B) |/~ A, then (C·B) |~ A (NR)}
R: 7. if C |~ A, C |/~ ~B, then (C·B) |~ A (RM)
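
One standard way to read these axioms probabilistically (a sketch under my own assumptions, not the slides' construction) is to take C |~ A to mean Pα[A | C] = 1 for a conditional probability function like the toy one above; the P and R rules can then be spot-checked by brute force over a two-letter language. All names below are illustrative.

```python
from itertools import product

WORLDS = [dict(A=a, B=b) for a, b in product([True, False], repeat=2)]
PRIOR = [0.4, 0.3, 0.2, 0.1]                 # strictly positive

def p(s):
    return sum(PRIOR[i] for i, w in enumerate(WORLDS) if s(w))

def cond(h, e):
    pe = p(e)
    return 1.0 if pe == 0 else p(lambda w: h(w) and e(w)) / pe

def nc(c, a):                                # C |~ A, read as Pα[A | C] = 1 (float tolerance)
    return cond(a, c) >= 1 - 1e-12

SENTS = []
for bits in product([False, True], repeat=4):
    table = dict(zip(product([True, False], repeat=2), bits))
    SENTS.append(lambda w, t=table: t[(w["A"], w["B"])])

AND = lambda x, y: (lambda w: x(w) and y(w))
NOT = lambda x: (lambda w: not x(w))

# (AND): if C |~ B and C |~ A, then C |~ (B·A)
assert all(not (nc(c, b) and nc(c, a)) or nc(c, AND(b, a))
           for c in SENTS for b in SENTS for a in SENTS)

# (WOR): if (C·B) |~ A and (C·~B) |~ A, then C |~ A
assert all(not (nc(AND(c, b), a) and nc(AND(c, NOT(b)), a)) or nc(c, a)
           for c in SENTS for b in SENTS for a in SENTS)

# (RM): if C |~ A and C |/~ ~B, then (C·B) |~ A
assert all(not (nc(c, a) and not nc(c, NOT(b))) or nc(AND(c, b), a)
           for c in SENTS for b in SENTS for a in SENTS)
```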

Conditional Probability Functions
(equivalent to those proposed by Karl Popper, Logic of Scientific Discovery, 1959)

0. for some E, F, Pα[F | E] ≠ 1
1. 0 ≤ Pα[A | B] ≤ 1
2. if B |= A, then Pα[A | B] = 1
3. if C |= B and B |= C, then Pα[A | B] = Pα[A | C]
4. Pα[(A·B) | C] = Pα[A | (B·C)] × Pα[B | C]
5. if C |= ~(A·B), then Pα[(A∨B) | C] = Pα[A | C] + Pα[B | C], or Pα[D | C] = 1 for all D

Conditional Probability Functions
(Janina Hosiasson-Lindenbaum, JSL, 12/1940)

1. 0 ≤ Pα[A | B] ≤ 1 {and for some E, F, Pα[F | E] ≠ 1}
2. if B |= A, then Pα[A | B] = 1
3. if C |= B and B |= C, then Pα[A | B] = Pα[A | C]
4. Pα[(A·B) | C] = Pα[A | (B·C)] × Pα[B | C]
5. if C |= ~(A·B), then Pα[(A∨B) | C] = Pα[A | C] + Pα[B | C], unless |= ~C {thus, unless Pα[D | C] = 1 for all D}