
1 THE MATHEMATICS OF CAUSAL MODELING Judea Pearl Department of Computer Science UCLA

2 OUTLINE
Modeling: Statistical vs. Causal
Causal Models and Identifiability
Inference to three types of claims:
1. Effects of potential interventions
2. Claims about attribution (responsibility)
3. Claims about direct and indirect effects
Robustness of Causal Claims

3 TRADITIONAL STATISTICAL INFERENCE PARADIGM Data → Joint Distribution P → Inference about Q(P) (aspects of P). e.g., infer whether customers who bought product A would also buy product B: Q = P(B|A).

4 THE CAUSAL INFERENCE PARADIGM Data → Data-Generating Model M → Joint Distribution → Inference about Q(M) (aspects of M). Some Q(M) cannot be inferred from P alone; e.g., infer whether customers who bought product A would still buy A if we were to double the price.

5 FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES Probability and statistics deal with static relations: Data → joint distribution → inferences from passive observations.

6 FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES Probability and statistics deal with static relations: Data → joint distribution → inferences from passive observations. Causal analysis deals with changes (dynamics), i.e., what remains invariant when P changes; P does not tell us how it ought to change. e.g., curing symptoms vs. curing diseases; analogy: mechanical deformation.

7 FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES Causal analysis deals with changes (dynamics): Data + causal assumptions (or experiments) → causal model → 1. effects of interventions, 2. causes of effects, 3. explanations.

8 FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)
CAUSAL: spurious correlation, randomization, confounding / effect, instrument, holding constant, explanatory variables.
STATISTICAL: regression, association / independence, "controlling for" / conditioning, odds and risk ratios, collapsibility.
1. Causal and statistical concepts do not mix.

9 FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)
CAUSAL: spurious correlation, randomization, confounding / effect, instrument, holding constant, explanatory variables.
STATISTICAL: regression, association / independence, "controlling for" / conditioning, odds and risk ratios, collapsibility.
1. Causal and statistical concepts do not mix.
2. No causes in – no causes out (Cartwright, 1989): {statistical assumptions + data} + causal assumptions ⟹ causal conclusions.
3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.

10 FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)
1. Causal and statistical concepts do not mix.
2. No causes in – no causes out (Cartwright, 1989): {statistical assumptions + data} + causal assumptions ⟹ causal conclusions.
3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.
4. Non-standard mathematics: a) structural equation models (Wright, 1920; Simon, 1960); b) counterfactuals (Neyman–Rubin Y_x; Lewis x ☐→ Y).

11 WHAT'S IN A CAUSAL MODEL? Oracle that assigns truth value to causal sentences: Action sentences:B if we do A. Counterfactuals:B would be different if A were true. Explanation:B occurred because of A. Optional: with what probability?

12 FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION [Diagram: a logic circuit with inputs X, Z and output Y.]

13 WHY CAUSALITY NEEDS SPECIAL MATHEMATICS SEM equations are non-algebraic. Static information: Y = 2X, X = 1, hence Y = 2. Process information: Had X been 3, Y would be 6. If we raise X to 3, Y would be 6; we must "wipe out" X = 1.
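The "wipe out X = 1" idea can be made concrete in a few lines. The following is a minimal sketch (not from the slides): structural equations represented as ordered assignments, with do(x) replacing an equation rather than solving the system algebraically.

```python
# Sketch: structural equations as assignments; do(x) replaces an equation.
def solve(equations):
    """Evaluate structural equations in causal order."""
    values = {}
    for var, f in equations:          # equations listed parents-first
        values[var] = f(values)
    return values

model = [("X", lambda v: 1),          # X = 1
         ("Y", lambda v: 2 * v["X"])] # Y = 2X

def do(equations, var, value):
    """Mutilate: replace var's equation with the constant var = value."""
    return [(v, (lambda _: value) if v == var else f) for v, f in equations]

print(solve(model)["Y"])              # 2  (pre-intervention)
print(solve(do(model, "X", 3))["Y"])  # 6  ("had X been 3, Y would be 6")
```

Conditioning on X = 3 in the original model would be contradictory (X is determined to be 1); mutilation sidesteps this by removing the equation for X, which is exactly why the equations are "non-algebraic".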

14 CAUSAL MODELS AND CAUSAL DIAGRAMS Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩ with a mutilation operator do(x): M → M_x, where: (i) V = {V_1, ..., V_n}, the endogenous variables; (ii) U = {U_1, ..., U_m}, the background variables; (iii) F, a set of n functions f_i: V \ V_i × U → V_i, with v_i = f_i(pa_i, u_i), PA_i ⊆ V \ V_i, U_i ⊆ U.

15 CAUSAL MODELS AND CAUSAL DIAGRAMS Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩ with a mutilation operator do(x): M → M_x, where: (i) V = {V_1, ..., V_n}, the endogenous variables; (ii) U = {U_1, ..., U_m}, the background variables; (iii) F, a set of n functions f_i: V \ V_i × U → V_i, with v_i = f_i(pa_i, u_i), PA_i ⊆ V \ V_i, U_i ⊆ U. [Diagram: an economic example with background variables U_1, U_2 and endogenous variables I, W, Q, P; the parent set PA_Q is highlighted.]

16 CAUSAL MODELS AND MUTILATION Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩ with a mutilation operator do(x): M → M_x, where: (i) V = {V_1, ..., V_n}, the endogenous variables; (ii) U = {U_1, ..., U_m}, the background variables; (iii) F, a set of n functions f_i: V \ V_i × U → V_i, with v_i = f_i(pa_i, u_i), PA_i ⊆ V \ V_i, U_i ⊆ U; (iv) M_x = ⟨U, V, F_x⟩, X ⊆ V, x a value assignment to X, where F_x = {f_i : V_i ∉ X} ∪ {X = x} (replace all functions f_i corresponding to X with the constant functions X = x).

17 CAUSAL MODELS AND MUTILATION Definition (as above): M = ⟨V, U, F⟩ with do(x): M → M_x; (i) V endogenous, (ii) U background, (iii) F the functions v_i = f_i(pa_i, u_i); (iv) [Diagram: the economic example with U_1, U_2 and I, W, Q, P.]

18 CAUSAL MODELS AND MUTILATION Definition (as above): M = ⟨V, U, F⟩ with do(x): M → M_x; (i) V endogenous, (ii) U background, (iii) F the functions v_i = f_i(pa_i, u_i); (iv) [Diagram: the mutilated model M_p, with the equation for P replaced by the constant P = p_0.]

19 PROBABILISTIC CAUSAL MODELS Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩ with a mutilation operator do(x): M → M_x, where: (i) V = {V_1, ..., V_n}, the endogenous variables; (ii) U = {U_1, ..., U_m}, the background variables; (iii) F, a set of n functions f_i: V \ V_i × U → V_i, with v_i = f_i(pa_i, u_i), PA_i ⊆ V \ V_i, U_i ⊆ U; (iv) M_x = ⟨U, V, F_x⟩, X ⊆ V, x a value assignment to X, where F_x = {f_i : V_i ∉ X} ∪ {X = x} (replace all functions f_i corresponding to X with the constant functions X = x). Definition (Probabilistic Causal Model): ⟨M, P(u)⟩, where P(u) is a probability assignment to the variables in U.

20 CAUSAL MODELS AND COUNTERFACTUALS Definition: The sentence "Y would be y (in situation u), had X been x," denoted Y_x(u) = y, means: the solution for Y in the mutilated model M_x (i.e., the equations for X replaced by X = x), with U = u, is equal to y. Joint probabilities of counterfactuals: P(Y_x = y, Z_w = z, ...) = Σ_{u: Y_x(u) = y, Z_w(u) = z, ...} P(u).

21 3 STEPS TO COMPUTING COUNTERFACTUALS S5. If the prisoner is dead, he would still be dead if A were not to have shot: D ⟹ D_{¬A}. [Diagram: U (court order) → C (captain) → A, B (riflemen) → D (prisoner).] Step 1, Abduction: from the evidence D = TRUE, infer U = TRUE.

22 3 STEPS TO COMPUTING COUNTERFACTUALS S5. If the prisoner is dead, he would still be dead if A were not to have shot: D ⟹ D_{¬A}. Abduction: from D = TRUE, infer U = TRUE. Action: set A = FALSE in the mutilated model. Prediction: with U = TRUE and A = FALSE, B = TRUE and hence D = TRUE.

23 COMPUTING PROBABILITIES OF COUNTERFACTUALS P(S5). The prisoner is dead. How likely is it that he would be dead if A were not to have shot? P(D_{¬A} | D) = ? Abduction: update the prior P(u) to P(u | D). Action: set A = FALSE. Prediction: compute P(D_{¬A} | D) in the mutilated model under P(u | D).
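The abduction–action–prediction recipe can be run mechanically on the firing-squad model. The sketch below assumes a binary court order U with an illustrative prior p = 0.7 (the prior value is my assumption; variable names follow the slides).

```python
# Firing-squad model: C = U, A = C, B = C, D = A or B.
# Abduction-action-prediction for P(D_{not A} | D).
def predict(u, do_A=None):
    C = u
    A = C if do_A is None else do_A   # do_A overrides A's equation
    B = C
    return A or B                     # D

p = 0.7                               # assumed prior P(U = true)
prior = {True: p, False: 1 - p}

# Step 1 (abduction): P(u | D = true)
posterior = {u: prior[u] * (1 if predict(u) else 0) for u in prior}
norm = sum(posterior.values())
posterior = {u: w / norm for u, w in posterior.items()}

# Steps 2-3 (action + prediction): set A = FALSE, recompute D
answer = sum(w for u, w in posterior.items() if predict(u, do_A=False))
print(answer)  # 1.0 -- the prisoner would still be dead (B would have shot)
```

The answer is 1 regardless of p: observing D forces U = TRUE under abduction, and in the mutilated model B still fires.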

24 CAUSAL INFERENCE MADE EASY (1985–2000) 1. Inference with nonparametric structural equations, made possible through graphical analysis. 2. Mathematical underpinning of counterfactuals through nonparametric structural equations. 3. Graphical–counterfactual symbiosis.

25 IDENTIFIABILITY Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions. Q is identifiable relative to A iff P(M_1) = P(M_2) ⟹ Q(M_1) = Q(M_2) for all M_1, M_2 that satisfy A.

26 IDENTIFIABILITY Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions. Q is identifiable relative to A iff P(M_1) = P(M_2) ⟹ Q(M_1) = Q(M_2) for all M_1, M_2 that satisfy A. In other words, Q can be determined uniquely from the probability distribution P(v) of the endogenous variables V, together with the assumptions A.

27 IDENTIFIABILITY Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions. Q is identifiable relative to A iff P(M_1) = P(M_2) ⟹ Q(M_1) = Q(M_2) for all M_1, M_2 that satisfy A. In this talk: A: assumptions encoded in the diagram. Q_1: P(y | do(x)), causal effect (= P(Y_x = y)). Q_2: P(Y_x' = y' | x, y), probability of necessity. Q_3: direct effect.

28 THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE Causal Markov Theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as P(v_1, ..., v_n) = Π_i P(v_i | pa_i), where pa_i are the (values of the) parents of V_i in the causal diagram associated with M.

29 THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE Causal Markov Theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as P(v_1, ..., v_n) = Π_i P(v_i | pa_i), where pa_i are the (values of the) parents of V_i in the causal diagram associated with M. Corollary (Truncated Factorization; Manipulation Theorem): The distribution generated by an intervention do(X = x) in a Markovian model M is given by the truncated factorization P(v_1, ..., v_n | do(x)) = Π_{i: V_i ∉ X} P(v_i | pa_i), evaluated at X = x.
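The truncated factorization can be checked by hand on a toy model. A sketch, assuming a three-node Markovian model Z → X → Y with Z → Y (Z confounds X and Y); the CPT numbers are illustrative, not from the talk.

```python
# Sketch of the truncated-factorization corollary on Z -> X -> Y, Z -> Y.
P_z = {0: 0.6, 1: 0.4}                                      # P(z)
P_x = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}            # P(x | z)
P_y = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.5, (1, 1): 0.8}  # P(Y=1 | x, z)

# Pre-intervention: P(z, x, y) = P(z) P(x|z) P(y|x,z)
def p_y1_given_x1():
    num = sum(P_z[z] * P_x[z][1] * P_y[(1, z)] for z in (0, 1))
    den = sum(P_z[z] * P_x[z][1] for z in (0, 1))
    return num / den

# Truncated factorization: P(z, y | do(X=1)) = P(z) P(y | X=1, z),
# i.e. the factor P(x | z) is simply dropped.
def p_y1_do_x1():
    return sum(P_z[z] * P_y[(1, z)] for z in (0, 1))

print(round(p_y1_given_x1(), 3))  # 0.71 -- seeing X = 1
print(round(p_y1_do_x1(), 3))     # 0.62 -- doing X = 1
```

The gap between the two numbers is exactly the confounding contributed by Z.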

30 RAMIFICATIONS OF THE FUNDAMENTAL THEOREM Given P(x,y,z), should we ban smoking? Pre-intervention: Smoking (X) → Tar in Lungs (Z) → Cancer (Y), with an unobserved U affecting both X and Y. Post-intervention: the same diagram with X held fixed at x.

31 RAMIFICATIONS OF THE FUNDAMENTAL THEOREM Given P(x,y,z), should we ban smoking? Pre-intervention: Smoking (X) → Tar in Lungs (Z) → Cancer (Y), with an unobserved U affecting both X and Y. Post-intervention: the same diagram with X held fixed at x. To compute P(y, z | do(x)), we must eliminate u. (A graphical problem.)

32 THE BACK-DOOR CRITERION Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in G_X̲, the subgraph obtained by deleting all arrows emanating from X. [Diagrams: a graph G over Z_1, ..., Z_6, X, Y, and the corresponding G_X̲ with an admissible set Z shaded.]

33 RAMIFICATIONS OF THE FUNDAMENTAL THEOREM Given P(x,y,z), should we ban smoking? Pre-intervention: Smoking (X) → Tar in Lungs (Z) → Cancer (Y), with an unobserved U affecting both X and Y. Post-intervention: the same diagram with X held fixed at x.

34 THE BACK-DOOR CRITERION Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in G_X̲, the subgraph obtained by deleting all arrows emanating from X. [Diagrams: G and G_X̲ over Z_1, ..., Z_6, X, Y, with the admissible set Z shaded.] Moreover, P(y | do(x)) = Σ_z P(y | x, z) P(z) ("adjusting" for Z).
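The adjustment formula also works as an estimator on finite samples. A sketch with simulated data from an illustrative confounded model (all parameter values below are my choices, not from the talk): Z is a back-door admissible set, so adjusting for it recovers the interventional quantity, while the naive conditional does not.

```python
import numpy as np

# Sketch: estimating P(Y=1 | do(X=1)) by back-door adjustment on samples.
rng = np.random.default_rng(0)
n = 200_000
z = rng.random(n) < 0.5                        # confounder
x = rng.random(n) < np.where(z, 0.75, 0.25)    # P(X=1 | z)
y = rng.random(n) < np.where(x, np.where(z, 0.9, 0.3),
                                np.where(z, 0.5, 0.1))

# Naive conditional P(Y=1 | X=1): biased by the back-door path through Z.
naive = y[x].mean()

# Back-door adjustment: sum_z P(Y=1 | X=1, z) P(z)
adjusted = sum(y[x & (z == v)].mean() * (z == v).mean() for v in (True, False))

print(round(naive, 2), round(adjusted, 2))  # ~0.75 (biased) vs ~0.60 (true)
```

The true interventional value here is 0.5 * 0.9 + 0.5 * 0.3 = 0.6, which the adjusted estimate approaches as n grows.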

35 RULES OF CAUSAL CALCULUS Rule 1 (Ignoring observations): P(y | do{x}, z, w) = P(y | do{x}, w). Rule 2 (Action/observation exchange): P(y | do{x}, do{z}, w) = P(y | do{x}, z, w). Rule 3 (Ignoring actions): P(y | do{x}, do{z}, w) = P(y | do{x}, w). (Each rule applies under a corresponding d-separation condition in a mutilated graph.)

36 DERIVATION IN CAUSAL CALCULUS Smoking → Tar → Cancer, with an unobserved Genotype affecting Smoking and Cancer.
P(c | do{s}) = Σ_t P(c | do{s}, t) P(t | do{s})  [Probability axioms]
= Σ_t P(c | do{s}, do{t}) P(t | do{s})  [Rule 2]
= Σ_t P(c | do{s}, do{t}) P(t | s)  [Rule 2]
= Σ_t P(c | do{t}) P(t | s)  [Rule 3]
= Σ_s' Σ_t P(c | do{t}, s') P(s' | do{t}) P(t | s)  [Probability axioms]
= Σ_s' Σ_t P(c | t, s') P(s' | do{t}) P(t | s)  [Rule 2]
= Σ_s' Σ_t P(c | t, s') P(s') P(t | s)  [Rule 3]
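The end product of this derivation is the front-door formula, and it can be verified numerically. A sketch on a toy smoking model with an explicit (normally unobserved) genotype U; all CPT numbers are illustrative. The front-door estimate uses only the observed joint over (S, T, C), yet matches the ground-truth intervention computed with access to U.

```python
# Sketch: front-door formula P(c|do(s)) = sum_t P(t|s) sum_{s'} P(c|t,s') P(s')
P_u = {0: 0.5, 1: 0.5}                 # genotype
P_s = {0: 0.4, 1: 0.8}                 # P(S=1 | u)
P_t = {0: 0.2, 1: 0.9}                 # P(T=1 | s): tar depends on smoking only
P_c = {(0, 0): 0.1, (0, 1): 0.5,
       (1, 0): 0.4, (1, 1): 0.9}       # P(C=1 | t, u): no direct S -> C edge

def p_stc(s, t, c):
    """Observed joint P(s, t, c), with U marginalized out."""
    tot = 0.0
    for u in (0, 1):
        ps = P_s[u] if s else 1 - P_s[u]
        pt = P_t[s] if t else 1 - P_t[s]
        pc = P_c[(t, u)] if c else 1 - P_c[(t, u)]
        tot += P_u[u] * ps * pt * pc
    return tot

def marg(s=None, t=None, c=None):
    vals = lambda v: (0, 1) if v is None else (v,)
    return sum(p_stc(si, ti, ci)
               for si in vals(s) for ti in vals(t) for ci in vals(c))

def front_door(s):                     # uses only the observed distribution
    return sum((marg(s=s, t=t) / marg(s=s)) *
               sum(marg(s=s2, t=t, c=1) / marg(s=s2, t=t) * marg(s=s2)
                   for s2 in (0, 1))
               for t in (0, 1))

def truth(s):                          # direct intervention in the full model
    return sum(P_u[u] * (P_t[s] * P_c[(1, u)] + (1 - P_t[s]) * P_c[(0, u)])
               for u in (0, 1))

print(round(front_door(1), 4), round(truth(1), 4))  # 0.615 0.615 -- they agree
```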

37 RECENT RESULTS ON IDENTIFICATION Theorem (Tian, 2002): We can identify P(v | do{x}) (x a singleton) if and only if there is no child Z of X connected to X by a bi-directed path. [Diagram: X with children Z_1, ..., Z_k linked to X by bi-directed arcs.]

38 RECENT RESULTS ON IDENTIFICATION (CONT.) Do-calculus is complete. A complete graphical criterion is available for identifying the causal effects of any set on any set. References: Shpitser and Pearl, 2006 (AAAI, UAI).

39 OUTLINE
Modeling: Statistical vs. Causal
Causal models and identifiability
Inference to three types of claims:
1. Effects of potential interventions
2. Claims about attribution (responsibility)
3. Claims about direct and indirect effects

40 DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem) Your Honor! My client (Mr. A) died BECAUSE he used that drug.

41 DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem) Your Honor! My client (Mr. A) died BECAUSE he used that drug. Court to decide if it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug! P(? | A is dead, took the drug) > 0.50

42 THE PROBLEM Theoretical Problems: 1.What is the meaning of PN(x,y): “Probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.”

43 THE PROBLEM Theoretical Problems: 1. What is the meaning of PN(x,y): "the probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur." Answer: PN(x,y) = P(Y_x' = y' | X = x, Y = y).

44 THE PROBLEM Theoretical Problems: 1. What is the meaning of PN(x,y): "the probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur." 2. Under what conditions can PN(x,y) be learned from statistical data, i.e., from observational, experimental, or combined data?

45 WHAT IS INFERABLE FROM EXPERIMENTS? Simple experiment: Q = P(Y_x = y | z), with Z nondescendants of X. Compound experiment: Q = P(Y_{X(z)} = y | z). Multi-stage experiment: etc.

46 CAUSAL INFERENCE MADE EASY (1985–2000) 1. Inference with nonparametric structural equations, made possible through graphical analysis. 2. Mathematical underpinning of counterfactuals through nonparametric structural equations. 3. Graphical–counterfactual symbiosis.

47 AXIOMS OF CAUSAL COUNTERFACTUALS Y_x(u) = y: "Y would be y, had X been x (in state U = u)." 1. Definiteness 2. Uniqueness 3. Effectiveness 4. Composition 5. Reversibility

48 GRAPHICAL–COUNTERFACTUAL SYMBIOSIS Every causal model implies constraints on counterfactuals (e.g., consistency), and these constraints are readable from the graph. Every theorem in SEM is a theorem in the Neyman–Rubin framework, and conversely.

49 CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?
Nonexperimental data: drug usage predicts longer life.
Experimental data: drug has negligible effect on survival.

                 Experimental        Nonexperimental
                 do(x)    do(x')     x        x'
Deaths (y)          16       14       2        28
Survivals (y')     984      986     998       972
                 1,000    1,000   1,000     1,000

Plaintiff: Mr. A is special. 1. He actually died. 2. He used the drug by choice. Court to decide (given both data): is it more probable than not that A would be alive but for the drug?

50 TYPICAL THEOREMS (Tian and Pearl, 2000) Bounds given combined nonexperimental and experimental data: max{0, [P(y) − P(y_x')] / P(x,y)} ≤ PN ≤ min{1, [P(y'_x') − P(x',y')] / P(x,y)}. Identifiability under monotonicity (combined data): PN = [P(y|x) − P(y|x')] / P(y|x) + [P(y|x') − P(y_x')] / P(x,y), a corrected excess risk ratio.

51 SOLUTION TO THE ATTRIBUTION PROBLEM (CONT) WITH PROBABILITY ONE: P(Y_x' = y' | x, y) = 1. From population data to individual case. Combined data tell more than each study alone.
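The "probability one" conclusion falls out of the lower bound applied to the drug table on slide 49. A sketch, assuming the nonexperimental groups reflect a population with P(x) = 1/2 (an assumption on my part, since the table reports equal groups of 1,000):

```python
# Sketch: Tian-Pearl lower bound PN >= [P(y) - P(y_{x'})] / P(x, y),
# evaluated on the drug-litigation table.
p_y_do_x_prime = 14 / 1000     # experimental: P(y | do(x'))
p_y_given_x    = 2 / 1000      # nonexperimental: P(y | x)
p_y_given_xp   = 28 / 1000     # nonexperimental: P(y | x')
p_x = 0.5                      # assumed P(x) in the population

p_y  = p_x * p_y_given_x + (1 - p_x) * p_y_given_xp   # 0.015
p_xy = p_x * p_y_given_x                              # 0.001

lower_bound = max(0.0, (p_y - p_y_do_x_prime) / p_xy)
print(round(lower_bound, 6))   # 1.0 -- the drug was necessary with probability one
```

Since PN is a probability, the lower bound of 1 pins it down exactly: the court's "more probable than not" threshold of 0.5 is met with room to spare.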

52 OUTLINE
Modeling: Statistical vs. Causal
Causal models and identifiability
Inference to three types of claims:
1. Effects of potential interventions
2. Claims about attribution (responsibility)
3. Claims about direct and indirect effects

53 QUESTIONS ADDRESSED What is the semantics of direct and indirect effects? Can we estimate them from data? From experimental data?

54 WHY DECOMPOSE EFFECTS? 1. Direct (or indirect) effects may be more transportable. 2. Indirect effects may be prevented or controlled. [Example: Pill → Thrombosis (+) directly, and Pill → Pregnancy (−) → Thrombosis (+) indirectly.] 3. Direct (or indirect) effects may be forbidden. [Example: Gender → Hiring directly vs. through Qualification.]

55 TOTAL, DIRECT, AND INDIRECT EFFECTS HAVE SIMPLE SEMANTICS IN LINEAR MODELS [Diagram: X → Z (coefficient b), Z → Y (coefficient c), X → Y (coefficient a).] z = bx + ε₁, y = ax + cz + ε₂. Direct effect: a; indirect effect: bc; total effect: a + bc.
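The linear readings can be verified by simulating interventions directly. A sketch with illustrative coefficients (my choices): the total effect is estimated by do(x) on the whole model, the controlled direct effect by additionally fixing z.

```python
import random

# Sketch: z = bx + e1, y = ax + cz + e2; verify TE = a + bc, DE = a.
a, b, c = 1.5, 2.0, -0.5
random.seed(0)

def sample(x, z=None):
    """One draw of y under do(X=x), optionally also do(Z=z)."""
    e1, e2 = random.gauss(0, 1), random.gauss(0, 1)
    if z is None:
        z = b * x + e1
    return a * x + c * z + e2

n = 200_000
te = sum(sample(1) - sample(0) for _ in range(n)) / n               # ~ a + b*c = 0.5
de = sum(sample(1, z=0.0) - sample(0, z=0.0) for _ in range(n)) / n # ~ a = 1.5
print(round(te, 2), round(de, 2))
```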

56 SEMANTICS BECOMES NONTRIVIAL IN NONLINEAR MODELS (even when the model is completely specified) z = f(x, ε₁), y = g(x, z, ε₂). Is the direct effect dependent on z? Is the indirect effect void of operational meaning?

57 THE OPERATIONAL MEANING OF DIRECT EFFECTS z = f(x, ε₁), y = g(x, z, ε₂). "Natural" direct effect of X on Y: the expected change in Y per unit change of X, when we keep Z constant at whatever value it attained before the change. In linear models, NDE = controlled direct effect.

58 THE OPERATIONAL MEANING OF INDIRECT EFFECTS z = f(x, ε₁), y = g(x, z, ε₂). "Natural" indirect effect of X on Y: the expected change in Y when we keep X constant, say at x₀, and let Z change to whatever value it would have attained under a unit change in X. In linear models, NIE = TE − DE.

59 POLICY IMPLICATIONS (Who cares?) What is the direct effect of X on Y? The effect of Gender on Hiring if sex discrimination is eliminated. [Diagram: Gender (X) → Hiring (Y) directly; the indirect path Gender → Qualification (Z) → Hiring is ignored.]

60 LEGAL DEFINITIONS TAKE THE NATURAL CONCEPTION (FORMALIZING DISCRIMINATION) "The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of a different race (age, sex, religion, national origin, etc.) and everything else had been the same." [Carson v. Bethlehem Steel Corp., 70 FEP Cases 921, 7th Cir. (1996)] x = male, x' = female; y = hire, y' = not hire; z = applicant's qualifications. NO DIRECT EFFECT.

61 SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS Consider the quantity Q = E[Y_{x, Z_{x*}}(u)]. Given ⟨M, P(u)⟩, Q is well defined: given u, Z_{x*}(u) is the solution for Z in M_{x*} (call it z), and Y_{xz}(u) is the solution for Y in M_{xz}. Can Q be estimated from data?

62 GRAPHICAL CONDITION FOR EXPERIMENTAL IDENTIFICATION OF AVERAGE NATURAL DIRECT EFFECTS [Example graph omitted in transcript.] Theorem: The average natural direct effect is experimentally identifiable if there exists a set W such that (Y ⊥⊥ Z | W) in G_XZ.

63 HOW DOES THE PROOF GO? Proof: [derivation omitted in transcript]. Each factor is identifiable by experimentation.

64 GRAPHICAL CRITERION FOR COUNTERFACTUAL INDEPENDENCE [Diagrams: three graphs over X, Y, Z with background variables U₁, U₂, U₃, illustrating when counterfactual independence holds.]

65 GRAPHICAL CONDITION FOR NONEXPERIMENTAL IDENTIFICATION OF AVERAGE NATURAL DIRECT EFFECTS Identification conditions: 1. There exists a W such that (Y ⊥⊥ Z | W) in G_XZ. 2. There exist additional covariates that render all counterfactual terms identifiable.

66 IDENTIFICATION IN MARKOVIAN MODELS Corollary 3: The average natural direct effect in Markovian models is identifiable from nonexperimental data, and it is given by the mediation formula NDE(x*, x; Y) = Σ_z [E(Y | x, z) − E(Y | x*, z)] P(z | x*). [Diagram: X → Z → Y, with X → Y.]

67 RELATIONS BETWEEN TOTAL, DIRECT, AND INDIRECT EFFECTS Theorem 5: The total, direct, and indirect effects obey the following equality: TE(x*, x; Y) = DE(x*, x; Y) − IE(x, x*; Y). In words, the total effect (on Y) associated with the transition from x* to x is equal to the difference between the direct effect associated with this transition and the indirect effect associated with the reverse transition, from x to x*.
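Theorem 5 can be checked numerically against the Markovian mediation formulas of slide 66. A sketch on a binary unconfounded model X → Z → Y with X → Y; the CPT numbers are illustrative.

```python
# Sketch: natural direct/indirect effects on a binary toy model.
pz = {0: 0.3, 1: 0.8}                       # P(Z=1 | x)
ey = {(0, 0): 0.1, (0, 1): 0.4,
      (1, 0): 0.6, (1, 1): 0.7}             # E[Y | x, z]

def P_z(z, x):
    return pz[x] if z else 1 - pz[x]

def nde(x0, x1):                            # natural direct effect, x0 -> x1
    return sum((ey[(x1, z)] - ey[(x0, z)]) * P_z(z, x0) for z in (0, 1))

def nie(x0, x1):                            # natural indirect effect, x0 -> x1
    return sum(ey[(x0, z)] * (P_z(z, x1) - P_z(z, x0)) for z in (0, 1))

def te(x0, x1):                             # total effect
    return sum(ey[(x1, z)] * P_z(z, x1) - ey[(x0, z)] * P_z(z, x0)
               for z in (0, 1))

# Theorem 5: TE(x* -> x) = NDE(x* -> x) - NIE(x -> x*)
print(round(te(0, 1), 4), round(nde(0, 1) - nie(1, 0), 4))  # 0.49 0.49
```

Note that the subtracted indirect effect runs in the reverse direction (x → x*), which is why the decomposition is not simply TE = NDE + NIE in nonlinear models.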

68 GENERAL PATH-SPECIFIC EFFECTS (Def.) [Diagrams: a graph over X, W, Z, Y and an active subgraph g; x*, with z* = Z_{x*}(u).] Form a new model, specific to the active subgraph g. Definition: the g-specific effect. Nonidentifiable even in Markovian models.

69 ANSWERS TO QUESTIONS Graphical conditions for estimability from experimental / nonexperimental data. Graphical conditions hold in Markovian models

70 ANSWERS TO QUESTIONS Graphical conditions for estimability from experimental / nonexperimental data. Useful in answering new types of policy questions, involving mechanism blocking instead of variable fixing. The graphical conditions hold in Markovian models.

71 THE OVERRIDING THEME 1. Define Q(M) as a counterfactual expression. 2. Determine the conditions under which Q(M) reduces to an expression estimable from data. 3. If the reduction is feasible, Q is inferable. Demonstrated on three types of queries: Q₁: P(y | do(x)), causal effect (= P(Y_x = y)). Q₂: P(Y_x' = y' | x, y), probability of necessity. Q₃: direct effect.

72 CONCLUSIONS Structural-model semantics, enriched with logic and graphs, leads to formal interpretation and practical assessment of a wide variety of causal and counterfactual relationships.

73 OUTLINE
Modeling: Statistical vs. Causal
Causal Models and Identifiability
Inference to three types of claims:
1. Effects of potential interventions
2. Claims about attribution (responsibility)
3. Claims about direct and indirect effects
Actual Causation and Explanation
Robustness of Causal Claims

74 ROBUSTNESS: MOTIVATION In linear systems: y = βx + u, with Smoking (x) → Cancer (y) and unobserved Genetic Factors (u). If cov(x, u) = 0, β is identifiable: β = R_yx.

75 ROBUSTNESS: MOTIVATION The claim β = R_yx is sensitive to the assumption cov(x, u) = 0: β is non-identifiable if cov(x, u) ≠ 0.

76 ROBUSTNESS: MOTIVATION Z (Price of Cigarettes) is an instrumental variable: cov(z, u) = 0. β is identifiable even if cov(x, u) ≠ 0: β = R_yz / R_xz.
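The contrast between the two estimands can be seen in simulation. A sketch with an illustrative linear model (coefficients and noise scales are my choices): the naive regression slope R_yx is biased when cov(x, u) ≠ 0, while the instrumental-variable ratio recovers β.

```python
import numpy as np

# Sketch: OLS slope vs. the Wald/IV estimand R_yz / R_xz.
rng = np.random.default_rng(1)
n = 200_000
beta = 0.8
u = rng.normal(size=n)                        # genetic factors (unobserved)
z = rng.normal(size=n)                        # price of cigarettes (instrument)
x = 0.5 * z + 0.9 * u + rng.normal(size=n)    # smoking: confounded by u
y = beta * x + u + rng.normal(size=n)         # cancer

r_yx = np.cov(y, x)[0, 1] / np.var(x)             # naive regression slope
iv   = np.cov(y, z)[0, 1] / np.cov(x, z)[0, 1]    # instrumental-variable estimate

print(round(r_yx, 2), round(iv, 2))  # ~1.24 (biased upward by u) vs ~0.80
```

The instrument works because z moves x without touching u, so all of the z-induced variation in y flows through the causal path x → y.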

77 ROBUSTNESS: MOTIVATION [Diagram as before, with the instrument Z, Price of Cigarettes.] Claim "β = R_yx" is likely to be true.

78 ROBUSTNESS: MOTIVATION Invoking several instruments, Z₁ (Price of Cigarettes) and Z₂ (Peer Pressure): if β₀ = β₁ = β₂, the claim "β = β₀" is more likely correct.

79 ROBUSTNESS: MOTIVATION Greater surprise, with further instruments Z₃ (Anti-smoking Legislation), ..., Zₙ: β₁ = β₂ = β₃ = … = βₙ = q. The claim "β = q" is then highly likely to be correct.

80 ROBUSTNESS: MOTIVATION Assume we have several independent estimands of β, with β₁ = β₂ = … = βₙ. Given a parameter β in a general graph, find the degree to which β is robust to violations of the model assumptions.

81 ROBUSTNESS: ATTEMPTED FORMULATION Bad attempt: parameter β is robust (over-identified) if β = f₁(P) = f₂(P) for two distinct functions f₁, f₂ of the observed distribution.

82 ROBUSTNESS: MOTIVATION Add a symptom s of y (Cancer) to the smoking model. Is β robust if β₀ = β₁?

83 ROBUSTNESS: MOTIVATION Symptoms do not act as instruments: β remains non-identifiable if cov(x, u) ≠ 0. Why? Taking a noisy measurement (s) of an observed variable (y) cannot add new information.

84 ROBUSTNESS: MOTIVATION Adding many symptoms S₁, S₂, ..., Sₙ does not help: β remains non-identifiable.

85 INDEPENDENT ESTIMANDS: BASED ON DISTINCT SETS OF ASSUMPTIONS [Two diagrams over x, y, z; a table pairing each estimand with its assumptions.]

86 RELEVANCE: FORMULATION Definition 8: Let A be an assumption embodied in model M, and p a parameter in M. A is said to be relevant to p if and only if there exists a set of assumptions S in M such that S and A sustain the identification of p, but S alone does not sustain such identification. Theorem 2: An assumption A is relevant to p if and only if A is a member of a minimal set of assumptions sufficient for identifying p.

87 ROBUSTNESS: FORMULATION Definition 5 (Degree of over-identification) A parameter p (of model M ) is identified to degree k (read: k -identified) if there are k minimal sets of assumptions each yielding a distinct estimand of p.

88 ROBUSTNESS: FORMULATION Example: model x → y → z with parameters b (on x → y) and c (on y → z). [Diagrams G₁, G₂, G₃: minimal assumption sets for c; minimal assumption sets for b.]

89 FROM MINIMAL ASSUMPTION SETS TO MAXIMAL EDGE SUPERGRAPHS: FROM PARAMETERS TO CLAIMS Definition: A claim C is identified to degree k in model M (graph G) if there are k edge-supergraphs of G that permit the identification of C, each yielding a distinct estimand. e.g., the claim TE(x, z) = q (a total effect) in the chain x → y → z, with estimands TE(x, z) = R_zx and TE(x, z) = R_yx · R_{zy·x}.

90 FROM MINIMAL ASSUMPTION SETS TO MAXIMAL EDGE SUPERGRAPHS: FROM PARAMETERS TO CLAIMS Definition: A claim C is identified to degree k in model M (graph G) if there are k edge-supergraphs of G that permit the identification of C, each yielding a distinct estimand. e.g., the claim TE(x, z) = q (a total effect), nonparametric case. [Diagrams of edge-supergraphs of x → y → z.]

91 SUMMARY OF ROBUSTNESS RESULTS 1. A formal definition of ROBUSTNESS of causal claims: "A claim is robust when it is insensitive to violations of some of the model assumptions relevant to substantiating that claim." 2. Graphical criteria and algorithms for computing the degree of robustness of a given causal claim.
