REASONING WITH CAUSE AND EFFECT Judea Pearl Department of Computer Science UCLA

OUTLINE
Modeling: Statistical vs. Causal
Causal Models and Identifiability
Inference to three types of claims:
  1. Effects of potential interventions
  2. Claims about attribution (responsibility)
  3. Claims about direct and indirect effects
Actual Causation and Explanation
Falsifiability and Corroboration

TRADITIONAL STATISTICAL INFERENCE PARADIGM

[Diagram: Data → P (joint distribution) → Inference → Q(P) (aspects of P)]

e.g., Infer whether customers who bought product A would also buy product B:
  Q = P(B | A)

THE CAUSAL INFERENCE PARADIGM

[Diagram: Data → M (data-generating model) → Inference → Q(M) (aspects of M)]

Some Q(M) cannot be inferred from P.
e.g., Infer whether customers who bought product A would still buy A if we double the price.

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

Probability and statistics deal with static relations:
[Diagram: Data → joint distribution → inferences from passive observations, labeled "Statistics" and "Probability"]

Causal analysis deals with changes (dynamics), i.e., what remains invariant when P changes. P does not tell us how it ought to change.
  e.g., Curing symptoms vs. curing diseases
  e.g., Analogy: mechanical deformation

[Diagram: Data + Causal assumptions (+ Experiments) → Causal Model → 1. Effects of interventions, 2. Causes of effects, 3. Explanations]

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

1. Causal and statistical concepts do not mix.
   CAUSAL: Spurious correlation, Randomization, Confounding / Effect, Instrument, Holding constant, Explanatory variables
   STATISTICAL: Regression, Association / Independence, "Controlling for" / Conditioning, Odds and risk ratios, Collapsibility

2. No causes in – no causes out (Cartwright, 1989):
   { statistical assumptions + data, causal assumptions } ⇒ causal conclusions

3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.

4. Non-standard mathematics:
   a) Structural equation models (SEM)
   b) Counterfactuals (Neyman-Rubin)
   c) Causal diagrams (Wright, 1920)

WHAT'S IN A CAUSAL MODEL?

Oracle that assigns truth values to causal sentences:
  Action sentences: B if we do A.
  Counterfactuals: B would be different if A were true.
  Explanation: B occurred because of A.
Optional: with what probability?

FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION

[Diagram: a logic circuit with inputs X, Z and output Y]

CAUSAL MODELS AND CAUSAL DIAGRAMS

Definition: A causal model is a 3-tuple
  M = ⟨V, U, F⟩
with a mutilation operator do(x): M → M_x, where:
  (i) V = {V_1, ..., V_n} endogenous variables,
  (ii) U = {U_1, ..., U_m} background variables,
  (iii) F = a set of n functions, f_i : V \ V_i ∪ U → V_i,
       v_i = f_i(pa_i, u_i),  PA_i ⊆ V \ V_i,  U_i ⊆ U

[Example diagram: price-demand model with endogenous I, W, Q, P, background U_1, U_2, and the parent set PA_Q highlighted]

CAUSAL MODELS AND MUTILATION

  (iv) M_x = ⟨U, V, F_x⟩, X ⊆ V, x a realization of X, where
       F_x = {f_i : V_i ∉ X} ∪ {X = x}
       (Replace all functions f_i corresponding to X with the constant functions X = x.)

[Diagram: the mutilated model M_p, in which the equation for P is replaced by the constant P = p_0]

PROBABILISTIC CAUSAL MODELS

Definition (Probabilistic Causal Model): ⟨M, P(u)⟩, where P(u) is a probability assignment to the variables in U.
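To make the 3-tuple M = ⟨V, U, F⟩ and its mutilation operator concrete, here is a minimal Python sketch (illustrative only; the class ScmSketch and its methods do and solve are my names, not the talk's):

```python
from typing import Callable, Dict

class ScmSketch:
    """A causal model: functions F mapping each endogenous V_i to f_i(pa_i, u_i)."""
    def __init__(self, functions: Dict[str, Callable[[dict], float]]):
        self.functions = dict(functions)

    def do(self, **fixed) -> "ScmSketch":
        """Mutilation do(x): replace f_i for each intervened variable by a constant."""
        f = dict(self.functions)
        for var, val in fixed.items():
            f[var] = lambda _vals, _v=val: _v  # constant function V_i = x
        return ScmSketch(f)

    def solve(self, u: dict) -> dict:
        """Solve a recursive (acyclic) model given the background context u."""
        vals, pending = dict(u), set(self.functions)
        while pending:
            progressed = False
            for var in list(pending):
                try:
                    vals[var] = self.functions[var](vals)
                    pending.discard(var)
                    progressed = True
                except KeyError:  # some parent not computed yet
                    continue
            assert progressed, "model must be recursive (acyclic)"
        return vals

# Two endogenous variables Z, Y and two background variables U1, U2:
m = ScmSketch({
    "Z": lambda v: 2.0 * v["U1"],           # z = f_Z(u_1)
    "Y": lambda v: v["Z"] + 3.0 * v["U2"],  # y = f_Y(z, u_2)
})
print(m.solve({"U1": 1.0, "U2": 1.0}))            # Z = 2.0, Y = 5.0
print(m.do(Z=0.0).solve({"U1": 1.0, "U2": 1.0}))  # Z forced to 0.0, Y = 3.0
```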

CAUSAL MODELS AND COUNTERFACTUALS

Definition (Potential Response): The sentence "Y would be y (in unit u), had X been x," denoted Y_x(u) = y, is the solution for Y in the mutilated model M_x, with the equations for X replaced by X = x. ("unit-based potential outcome")

Joint probabilities of counterfactuals:
  P(Y_x = y, Z_w = z) = ∑_{u: Y_x(u)=y, Z_w(u)=z} P(u)

In particular:
  P(y | do(x)) ≜ P(Y_x = y) = ∑_{u: Y_x(u)=y} P(u)

3-STEPS TO COMPUTING COUNTERFACTUALS

S5. If the prisoner is dead, he would still be dead if A were not to have shot: D ⇒ D_{¬A}

[Diagram: firing squad — U (court order) → C (captain) → riflemen A and B → D (prisoner dead)]

Step 1 – Abduction: from the evidence D = TRUE, infer U = TRUE.
Step 2 – Action: mutilate the model, setting A = FALSE.
Step 3 – Prediction: solve the mutilated model; D = TRUE still holds.

COMPUTING PROBABILITIES OF COUNTERFACTUALS

P(S5): The prisoner is dead. How likely is it that he would be dead if A were not to have shot?
  P(D_{¬A} | D) = ?

Step 1 – Abduction: update P(u) to P(u | D).
Step 2 – Action: set A = FALSE.
Step 3 – Prediction: compute P(D_{¬A} | D) in the mutilated model.
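A sketch of the three steps on the firing-squad model (the prior p on the court order U is an assumed illustration value; variable names follow the slide):

```python
p = 0.7  # assumed prior P(U = 1), for illustration only

def dead(u, do_A=None):
    """Solve the model: C = U; A = C (unless mutilated); B = C; D = A or B."""
    c = u
    a = c if do_A is None else do_A
    b = c
    return int(a or b)

# Step 1 (abduction): update P(u) to P(u | D = 1).
weights = {u: (p if u else 1 - p) * (1 if dead(u) else 0) for u in (0, 1)}
total = sum(weights.values())
posterior = {u: w / total for u, w in weights.items()}

# Step 2 (action) and Step 3 (prediction): set A = 0, recompute D.
p_D_notA_given_D = sum(pr * dead(u, do_A=0) for u, pr in posterior.items())
print(p_D_notA_given_D)  # 1.0 -- the prisoner would still be dead (S5)
```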

CAUSAL INFERENCE MADE EASY

1. Inference with nonparametric structural equations made possible through graphical analysis.
2. Mathematical underpinning of counterfactuals through nonparametric structural equations.
3. Graphical–counterfactual symbiosis.

IDENTIFIABILITY

Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions. Q is identifiable relative to A iff
  P(M_1) = P(M_2)  ⇒  Q(M_1) = Q(M_2)
for all M_1, M_2 that satisfy A.

In other words, Q can be determined uniquely from the probability distribution P(v) of the endogenous variables, V, and assumptions A.

In this talk:
  A: assumptions encoded in the diagram
  Q_1: P(y | do(x))          Causal effect (= P(Y_x = y))
  Q_2: P(Y_x′ = y′ | x, y)   Probability of necessity
  Q_3: Direct effect

THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE

Causal Markov Theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as
  P(v_1, ..., v_n) = ∏_i P(v_i | pa_i)
where pa_i are the (values of the) parents of V_i in the causal diagram associated with M.

Corollary (Truncated Factorization, Manipulation Theorem): The distribution generated by an intervention do(X = x) in a Markovian model M is given by the truncated factorization
  P(v_1, ..., v_n | do(x)) = ∏_{i: V_i ∉ X} P(v_i | pa_i)   for v consistent with x.
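A numeric sketch of the truncated factorization on a small Markovian model X → Z → Y (with X → Y as well); the conditional probability tables are made up for illustration:

```python
import itertools

# Made-up CPTs for binary X, Z, Y
P_x = {0: 0.6, 1: 0.4}                                          # P(x)
P_z_x = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}    # P(z | x), keyed (z, x)
P_y_xz = {(1, 0, 0): 0.1, (1, 0, 1): 0.5,                       # P(y | x, z), keyed (y, x, z)
          (1, 1, 0): 0.4, (1, 1, 1): 0.9}
for x, z in itertools.product((0, 1), repeat=2):
    P_y_xz[(0, x, z)] = 1 - P_y_xz[(1, x, z)]

def joint(x, z, y):            # P(x, z, y) = P(x) P(z|x) P(y|x,z)
    return P_x[x] * P_z_x[(z, x)] * P_y_xz[(y, x, z)]

def p_xy_do_z(x, y, z):        # truncated factorization: drop the factor P(z|x)
    return P_x[x] * P_y_xz[(y, x, z)]

p_do = sum(p_xy_do_z(x, 1, z=1) for x in (0, 1))
p_cond = (sum(joint(x, 1, 1) for x in (0, 1)) /
          sum(joint(x, 1, y) for x in (0, 1) for y in (0, 1)))
print(round(p_do, 3), round(p_cond, 3))  # 0.66 vs 0.78: do(Z=1) differs from observing Z=1
```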

RAMIFICATIONS OF THE FUNDAMENTAL THEOREM

Given P(x, y, z), should we ban smoking?

[Diagram: U (unobserved) → X (smoking) and Y (cancer); X → Z (tar in lungs) → Y]

Pre-intervention:   P(x, y, z) = ∑_u P(u) P(x | u) P(z | x) P(y | z, u)
Post-intervention:  P(y, z | do(x)) = ∑_u P(u) P(z | x) P(y | z, u)

To compute P(y, z | do(x)), we must eliminate u (a graphical problem).

THE BACK-DOOR CRITERION

Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in G_x, the subgraph obtained by deleting from G all arrows emerging from X.

[Diagram: graph G over X, Y, Z_1, ..., Z_6 and the subgraph G_x, with an admissible set Z highlighted]

Moreover ("adjusting" for Z):
  P(y | do(x)) = ∑_z P(y | x, z) P(z)
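A sketch of the adjustment formula as a finite-sample estimator (the function name and toy data are illustrative; it assumes Z satisfies the back-door criterion):

```python
def backdoor_adjust(samples, x_val):
    """Estimate P(Y=1 | do(X=x_val)) as sum_z P(Y=1 | x, z) P(z).
    samples: iterable of (x, y, z) tuples with binary y."""
    samples = list(samples)
    n = len(samples)
    total = 0.0
    for z_val in {z for _, _, z in samples}:
        n_z = sum(1 for *_, z in samples if z == z_val)
        n_xz = sum(1 for x, _, z in samples if (x, z) == (x_val, z_val))
        n_yxz = sum(1 for x, y, z in samples if (x, y, z) == (x_val, 1, z_val))
        if n_xz:  # skip empty strata (a finite-sample concern)
            total += (n_yxz / n_xz) * (n_z / n)
    return total

data = [(1, 1, 0), (1, 0, 0), (0, 0, 0), (0, 1, 1), (1, 1, 1), (0, 0, 1)]
print(backdoor_adjust(data, x_val=1))
```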

RULES OF CAUSAL CALCULUS

Rule 1 (Ignoring observations):
  P(y | do{x}, z, w) = P(y | do{x}, w)
  if (Y ⊥ Z | X, W) holds in the graph with all arrows entering X deleted.

Rule 2 (Action/observation exchange):
  P(y | do{x}, do{z}, w) = P(y | do{x}, z, w)
  if (Y ⊥ Z | X, W) holds in the graph with arrows entering X and arrows leaving Z deleted.

Rule 3 (Ignoring actions):
  P(y | do{x}, do{z}, w) = P(y | do{x}, w)
  if (Y ⊥ Z | X, W) holds in the graph with arrows entering X deleted and arrows entering Z(W) deleted, where Z(W) is the set of Z-nodes that are not ancestors of any W-node in the X-mutilated graph.

DERIVATION IN CAUSAL CALCULUS

[Diagram: Smoking (S) → Tar (T) → Cancer (C), with an unobserved Genotype confounding S and C]

P(c | do{s})
  = ∑_t P(c | do{s}, t) P(t | do{s})                        Probability axioms
  = ∑_t P(c | do{s}, do{t}) P(t | do{s})                    Rule 2
  = ∑_t P(c | do{s}, do{t}) P(t | s)                        Rule 2
  = ∑_t P(c | do{t}) P(t | s)                               Rule 3
  = ∑_s′ ∑_t P(c | do{t}, s′) P(s′ | do{t}) P(t | s)        Probability axioms
  = ∑_s′ ∑_t P(c | t, s′) P(s′ | do{t}) P(t | s)            Rule 2
  = ∑_s′ ∑_t P(c | t, s′) P(s′) P(t | s)                    Rule 3
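The last line is the front-door formula. A small numeric sketch (made-up parameters for the genotype model) confirms it recovers the true interventional distribution:

```python
PU = {0: 0.5, 1: 0.5}                 # genotype (unobserved)
PS1_u = {0: 0.2, 1: 0.8}              # P(S=1 | u)
PT1_s = {0: 0.25, 1: 0.75}            # P(T=1 | s)
PC1_tu = {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.4, (1, 1): 0.8}  # P(C=1 | t, u)

def bern(p, v):  # P(V = v) for a Bernoulli with success probability p
    return p if v else 1 - p

def joint(s, t, c):  # observational P(s, t, c), genotype marginalized out
    return sum(PU[u] * bern(PS1_u[u], s) * bern(PT1_s[s], t) *
               bern(PC1_tu[(t, u)], c) for u in (0, 1))

def p_s(s):
    return sum(joint(s, t, c) for t in (0, 1) for c in (0, 1))

def front_door(c, s):  # sum_s' sum_t P(c | t, s') P(s') P(t | s)
    out = 0.0
    for t in (0, 1):
        p_t_s = sum(joint(s, t, cc) for cc in (0, 1)) / p_s(s)
        for s2 in (0, 1):
            p_c_ts2 = joint(s2, t, c) / sum(joint(s2, t, cc) for cc in (0, 1))
            out += p_c_ts2 * p_s(s2) * p_t_s
    return out

def truth(c, s):  # P(c | do(s)) computed from the full model, using u
    return sum(bern(PT1_s[s], t) * sum(PU[u] * bern(PC1_tu[(t, u)], c)
               for u in (0, 1)) for t in (0, 1))

print(round(front_door(1, 1), 6), round(truth(1, 1), 6))  # 0.525 0.525: they agree
```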

OUTLINE

Modeling: Statistical vs. Causal
Causal models and identifiability
Inference to three types of claims:
  1. Effects of potential interventions
  2. Claims about attribution (responsibility)
  3. Claims about direct and indirect effects

DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem)

Your Honor! My client (Mr. A) died BECAUSE he used that drug.

Court to decide if it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug:
  P(? | A is dead, took the drug) > 0.50

THE PROBLEM

Theoretical problems:
1. What is the meaning of PN(x, y), the "probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur"?
   Answer: PN(x, y) = P(Y_x′ = y′ | X = x, Y = y)
2. Under what conditions can PN(x, y) be learned from statistical data, i.e., observational, experimental, and combined?

WHAT IS INFERABLE FROM EXPERIMENTS?

Simple experiment:      Q = P(Y_x = y | z),  Z nondescendants of X.
Compound experiment:    Q = P(Y_{X(z)} = y | z)
Multi-stage experiment: etc.

CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?

Nonexperimental data: drug usage predicts longer life.
Experimental data: drug has negligible effect on survival.

                    Experimental          Nonexperimental
                   do(x)    do(x′)          x        x′
  Deaths (y)          16       14            2        28
  Survivals (y′)     984      986          998       972
                   1,000    1,000        1,000     1,000

Plaintiff: Mr. A is special.
  1. He actually died.
  2. He used the drug by choice.
Court to decide (given both data): Is it more probable than not that A would be alive but for the drug?

TYPICAL THEOREMS (Tian and Pearl, 2000)

Bounds given combined nonexperimental and experimental data:
  max{ 0, [P(y) − P(y_x′)] / P(x, y) }  ≤  PN  ≤  min{ 1, [P(y′_x′) − P(x′, y′)] / P(x, y) }

Identifiability under monotonicity (combined data):
  PN = [P(y | x) − P(y | x′)] / P(y | x)  +  [P(y | x′) − P(y_x′)] / P(x, y)
i.e., the excess risk ratio, corrected for confounding.

SOLUTION TO THE ATTRIBUTION PROBLEM (Cont)

WITH PROBABILITY ONE:  PN = P(y′_x′ | x, y) = 1

From population data to individual case.
Combined data tell more than each study alone.
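A sketch of the bound computation on the frequency table above (assuming, as in the example, that half the nonexperimental population chose the drug):

```python
# Nonexperimental death rates and the experimental rate under do(x')
p_y_x  = 2 / 1000     # P(y | x): deaths among those who chose the drug
p_y_x_ = 28 / 1000    # P(y | x'): deaths among those who did not
p_yx_  = 14 / 1000    # P(y_{x'}): experimental death rate without the drug
p_x = 0.5             # assumed share of drug takers

p_y   = p_x * p_y_x + (1 - p_x) * p_y_x_   # P(y) = 0.015
p_xy  = p_x * p_y_x                        # P(x, y) = 0.001
p_x_y_ = (1 - p_x) * (1 - p_y_x_)          # P(x', y') = 0.486

lower = max(0.0, (p_y - p_yx_) / p_xy)
upper = min(1.0, ((1 - p_yx_) - p_x_y_) / p_xy)
print(round(lower, 3), round(upper, 3))  # 1.0 1.0 -- PN = 1 with probability one
```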

OUTLINE

Modeling: Statistical vs. Causal
Causal models and identifiability
Inference to three types of claims:
  1. Effects of potential interventions
  2. Claims about attribution (responsibility)
  3. Claims about direct and indirect effects

QUESTIONS ADDRESSED

What is the semantics of direct and indirect effects?
Can we estimate them from data? Experimental data?

TOTAL, DIRECT, AND INDIRECT EFFECTS HAVE SIMPLE SEMANTICS IN LINEAR MODELS

[Diagram: X → Z (coefficient b), Z → Y (coefficient c), X → Y (coefficient a)]

  z = b·x + ε_1
  y = a·x + c·z + ε_2

Total effect: a + bc;  Indirect effect: bc;  Direct effect: a.
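A quick Monte Carlo check of this linear semantics (coefficient values are arbitrary illustrations):

```python
import random

a, b, c = 1.5, 2.0, -0.5  # arbitrary illustration values

def y_given_do_x(x):
    z = b * x + random.gauss(0, 1)             # z = bx + e1
    return a * x + c * z + random.gauss(0, 1)  # y = ax + cz + e2

random.seed(0)
n = 200_000
total_effect = sum(y_given_do_x(1.0) - y_given_do_x(0.0) for _ in range(n)) / n
print(round(total_effect, 2), a + b * c)  # both ~0.5: TE = a + bc, IE = bc, DE = a
```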

SEMANTICS BECOMES NONTRIVIAL IN NONLINEAR MODELS (even when the model is completely specified)

  z = f(x, ε_1)
  y = g(x, z, ε_2)

Dependent on z? Void of operational meaning?

THE OPERATIONAL MEANING OF DIRECT EFFECTS

  z = f(x, ε_1)
  y = g(x, z, ε_2)

"Natural" direct effect of X on Y: the expected change in Y per unit change of X, when we keep Z constant at whatever value it attains before the change.

In linear models, NDE = controlled direct effect.

POLICY IMPLICATIONS (Who cares?)

[Diagram: X (Gender) → Y (Hiring) directly, and indirectly through Z (Qualification); the indirect path is marked IGNORE]

What is the direct effect of X on Y?
The effect of Gender on Hiring if sex discrimination is eliminated.

THE OPERATIONAL MEANING OF INDIRECT EFFECTS

  z = f(x, ε_1)
  y = g(x, z, ε_2)

"Natural" indirect effect of X on Y: the expected change in Y when we keep X constant, say at x_0, and let Z change to whatever value it would have under a unit change in X.

In linear models, NIE = TE − DE.
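In the Markovian, no-confounding case (an assumption of this sketch), the natural effects reduce to the mediation formulas below, shown with made-up values for E[Y | x, z] and P(Z = 1 | x):

```python
E_y = {(0, 0): 0.2, (0, 1): 0.5, (1, 0): 0.4, (1, 1): 0.9}  # E[Y | x, z], keyed (x, z)
P_z1 = {0: 0.3, 1: 0.8}                                     # P(Z = 1 | x)

def p_z(z, x):
    return P_z1[x] if z == 1 else 1 - P_z1[x]

# NDE: change x from 0 to 1, but keep Z distributed as it was under x = 0
NDE = sum((E_y[(1, z)] - E_y[(0, z)]) * p_z(z, 0) for z in (0, 1))
# NIE: keep x at 0, but let Z respond as if x had changed to 1
NIE = sum(E_y[(0, z)] * (p_z(z, 1) - p_z(z, 0)) for z in (0, 1))
TE = (sum(E_y[(1, z)] * p_z(z, 1) for z in (0, 1)) -
      sum(E_y[(0, z)] * p_z(z, 0) for z in (0, 1)))
print(round(NDE, 3), round(NIE, 3), round(TE, 3))
# 0.26 0.15 0.51: in nonlinear models TE need not equal NDE + NIE
```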

LEGAL DEFINITIONS TAKE THE NATURAL CONCEPTION (FORMALIZING DISCRIMINATION)

"The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of a different race (age, sex, religion, national origin, etc.) and everything else had been the same."
  [Carson versus Bethlehem Steel Corp. (70 FEP Cases 921, 7th Cir. (1996))]

  x = male, x′ = female
  y = hire, y′ = not hire
  z = applicant's qualifications

NO DIRECT EFFECT

SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS

Consider the quantity Q = E[Y_{x, Z_x*}(u)], averaged over u.

Given ⟨M, P(u)⟩, Q is well defined:
  given u, Z_x*(u) is the solution for Z in M_x*, call it z;
  Y_{xz}(u) is then the solution for Y in M_xz.

Can Q be estimated from data?

ANSWERS TO QUESTIONS

Graphical conditions for estimability from experimental / nonexperimental data.
Useful in answering a new type of policy question, involving mechanism blocking instead of variable fixing.
The graphical conditions hold in Markovian models.

THE OVERRIDING THEME

1. Define Q(M) as a counterfactual expression.
2. Determine conditions for the reduction of Q(M) to an expression in P, the distribution of observed variables.
3. If the reduction is feasible, Q is inferable.

Demonstrated on three types of queries:
  Q_1: P(y | do(x))          Causal effect (= P(Y_x = y))
  Q_2: P(Y_x′ = y′ | x, y)   Probability of necessity
  Q_3: Direct effect

ACTUAL CAUSATION AND THE COUNTERFACTUAL TEST

"We may define a cause to be an object followed by another, ..., where, if the first object had not been, the second never had existed."  (Hume, Enquiry, 1748)

Lewis (1973): "x CAUSED y" if x and y are true, and y is false in the closest non-x-world.

Structural interpretation:
  (i) X(u) = x
  (ii) Y(u) = y
  (iii) Y_x′(u) ≠ y  for x′ ≠ x

PROBLEM WITH THE COUNTERFACTUAL DEFINITION

Back-up shooter fires iff the Captain does not shoot at 12:00 noon.

[Diagram: X (Captain) → Y (Prisoner); W (Back-up) → Y; X inhibits W]

Scenario: Captain shot before noon; the prisoner is dead.
  X = 1, W = 0, Y = 1

Q: Is the Captain's shot the cause of death?
A: Yes, but the counterfactual test fails! (Setting X = 0 yields W = 1 and still Y = 1.)

Intuition: the back-up might fall asleep – a structural contingency.

SELECTED STRUCTURAL CONTINGENCIES AND SUSTENANCE

x sustains y against W iff:
  (i) X(u) = x;
  (ii) Y(u) = y;
  (iii) Y_{xw}(u) = y  for all w; and
  (iv) Y_{x′w′}(u) = y′  for some x′ ≠ x and some w′

[Diagram: Captain example — with X = 1, Y = 1 for any w; with X = 0 and W = 0, Y = 0]
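A tiny sketch contrasting the plain counterfactual test with sustenance on the back-up story (variable roles as in the slides):

```python
def shooting(do_x=None, do_w=None):
    x = 1 if do_x is None else do_x           # Captain shoots at noon
    w = (1 - x) if do_w is None else do_w     # Back-up shoots iff Captain doesn't
    y = int(bool(x or w))                     # Prisoner dead
    return x, w, y

print(shooting())                # (1, 0, 1): the actual scenario
print(shooting(do_x=0))          # (0, 1, 1): Y_{x'} = 1, so test (iii) of Lewis fails
print(shooting(do_x=1, do_w=0))  # (1, 0, 1): Y_{xw} = 1 for the contingency w = 0 ...
print(shooting(do_x=0, do_w=0))  # (0, 0, 0): ... while Y_{x'w} = 0 -- x sustains y
```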

EXPLANATORY POWER (Halpern and Pearl, 2001)

Definition: The explanatory power of a proposition X = x relative to an observed event Y = y is given by P(K_{x,y} | x), the pre-discovery probability of the set of contexts K in which x is the actual cause of y.

[Diagram: Fire = Oxygen AND Match]

Here K_{O,F} = K_{M,F} = K, yet
  EP(Oxygen) = P(K | Oxygen) << 1
  EP(Match) = P(K | Match) ≈ 1

CORRECTNESS AND CORROBORATION

Data D corroborates structure S if S is (i) falsifiable and (ii) compatible with D.

Falsifiability: P*(S) ⊂ P*, i.e., the constraints implied by S carve out a proper subset of the possible distributions.

Types of constraints:
  1. conditional independencies
  2. inequalities (for restricted domains)
  3. functional

[Diagram: the space P* of distributions, the subset P*(S) implied by S, and the data D inside it; e.g., the chain w → x → y → z]
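For instance, in a linear version of the chain w → x → y → z, falsifiability shows up as a vanishing partial correlation; a simulation sketch (coefficients arbitrary):

```python
import random, statistics

random.seed(1)
n = 50_000
W = [random.gauss(0, 1) for _ in range(n)]
X = [0.8 * w + random.gauss(0, 1) for w in W]
Y = [0.5 * x + random.gauss(0, 1) for x in X]
Z = [1.2 * y + random.gauss(0, 1) for y in Y]

def corr(a, b):
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    da = sum((u - ma) ** 2 for u in a) ** 0.5
    db = sum((v - mb) ** 2 for v in b) ** 0.5
    return num / (da * db)

# The chain implies (W ⊥ Z | Y): the partial correlation r_{WZ.Y} should vanish
r_wz, r_wy, r_yz = corr(W, Z), corr(W, Y), corr(Y, Z)
partial = (r_wz - r_wy * r_yz) / ((1 - r_wy**2) ** 0.5 * (1 - r_yz**2) ** 0.5)
print(round(partial, 3))  # ~0: a falsifiable (testable) implication of the chain
```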

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

e.g., the structure [x →(a) y] is un-corroborated, yet a is identifiable: a = r_YX.

Intuitively, the claim a = r_YX is not corroborated, because the assumptions that entail the claim are not falsifiable; i.e., no data falsifies the assumption r_s = 0.
[Alternative structure: x →(a) y with a latent arc of strength r_s, under which a = r_YX − r_s]

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

A corroborated structure can imply uncorroborated claims.

e.g., [Diagrams: the corroborated chain x →(a) y → z implies the same claim a as the un-corroborated structure x →(a) y]

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

Some claims can be more corroborated than others.

e.g., [Diagrams: x →(a) y →(b) z vs. x →(a) y]
b = r_ZY is corroborated, because the assumptions needed for entailing this claim constrain the data.

FROM CORROBORATING MODELS TO CORROBORATING CLAIMS

Some claims can be more corroborated than others.

Definition: An identifiable claim C is corroborated by data if the union of all minimal sets of assumptions sufficient for identifying C is corroborated by the data.

[Diagrams: x →(a) y →(b) z and related substructures]

GRAPHICAL CRITERION FOR CORROBORATED CLAIMS

Theorem: An identifiable claim C is corroborated by data if the intersection of all maximal supergraphs sufficient for identifying C is corroborated by the data.

e.g., [Diagrams: the maximal supergraphs of x →(a) y → z that still identify a, and their intersection]

CONCLUSIONS

Structural-model semantics, enriched with logic and graphs, leads to formal interpretation and practical assessment of a wide variety of causal and counterfactual relationships.