
1. Learning Causal Structure from Observational and Experimental Data
Richard Scheines, Carnegie Mellon University

2. Causation, Statistics, and Experiments
A timeline of contributors: Francis Bacon and Galileo Galilei (c. 1500-1600); Charles Spearman, Udny Yule, Sewall Wright, Sir Ronald A. Fisher, Jerzy Neyman, and Trygve Haavelmo (c. 1900-1930); graphical causal models and potential outcomes (c. 1960-1990).

3. Causal Graphs
A causal graph G = {V, E}. Each edge X → Y represents a direct causal claim: X is a direct cause of Y relative to V.
Example from the slide's figure: Years of Education → Income, and Years of Education → Skills and Knowledge → Income.

4. Bridge Principles: Causal Graph over V → Constraints on P(V)
The Causal Markov Axiom, acyclicity, and the d-separation criterion together yield an independence oracle. For the causal graph Z → X with X → Y1 and X → Y2, the entailed constraints are:
Z _||_ Y1 | X
Z _||_ Y2 | X
Z _||_ Y1 | X, Y2
Z _||_ Y2 | X, Y1
Y1 _||_ Y2 | X
Y1 _||_ Y2 | X, Z
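Such an independence oracle can be implemented mechanically. A minimal sketch in pure Python, using the standard moralized-ancestral-graph test for d-separation (restrict to ancestors of the query variables, marry co-parents, delete the conditioning set, check connectivity); the graph below is the slide's Z → X → {Y1, Y2} example:

```python
from collections import deque

def ancestors(parents, nodes):
    """All nodes in `nodes` plus their ancestors, given a child -> parents map."""
    seen, stack = set(nodes), list(nodes)
    while stack:
        n = stack.pop()
        for p in parents.get(n, ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def d_separated(parents, xs, ys, zs):
    """Test X _||_ Y | Z in a DAG via the moralized ancestral graph."""
    relevant = ancestors(parents, set(xs) | set(ys) | set(zs))
    adj = {n: set() for n in relevant}
    for child in relevant:
        ps = set(parents.get(child, ())) & relevant
        for p in ps:                      # undirected parent-child edges
            adj[child].add(p)
            adj[p].add(child)
        for p in ps:                      # "marry" co-parents (moralization)
            adj[p] |= ps - {p}
    for z in zs:                          # delete the conditioning set
        adj.pop(z, None)
    for n in adj:
        adj[n] -= set(zs)
    frontier = deque(x for x in xs if x in adj)
    seen = set(frontier)
    while frontier:                       # BFS from X in the reduced graph
        n = frontier.popleft()
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return not (seen & set(ys))           # separated iff no Y is reachable

# The slide's graph: Z -> X, X -> Y1, X -> Y2 (as a child -> parents map)
g = {"X": {"Z"}, "Y1": {"X"}, "Y2": {"X"}}
print(d_separated(g, {"Z"}, {"Y1"}, {"X"}))   # True: Z _||_ Y1 | X
print(d_separated(g, {"Y1"}, {"Y2"}, {"X"}))  # True: Y1 _||_ Y2 | X
print(d_separated(g, {"Z"}, {"Y1"}, set()))   # False: Z, Y1 associated
```

The oracle reproduces all six constraints on the slide, and correctly reports dependence when the conditioning set is empty.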

5. Faithfulness
The independence constraints that a causal structure G imposes on a probability distribution P hold for all parameterizations of G.
Example:
  Revenues = a·Rate + c·Economy + ε_Rev
  Economy = b·Rate + ε_Econ
Faithfulness requires a ≠ -b·c; otherwise the direct effect of Rate on Revenues exactly cancels the indirect effect through Economy.

6. Faithfulness Violations by Design
Gene A → Protein 24 directly (+), and Gene A → Gene B → Protein 24 (+ then -): by evolutionary design, Gene A _||_ Protein 24.
Air Temp affects Core Body Temp directly, and also via a Homeostatic Regulator: by evolutionary design, Air Temp _||_ Core Body Temp.
Whether such cancellations appear in data depends on the sampling rate vs. the equilibration rate.

7. Causal Structure → Association
TV → Obesity, Obesity → TV, and TV ← C → Obesity each produce an association: in all three structures, TV is not independent of Obesity.

8. Modeling Ideal Interventions: Intervening on the Effect
Pre-experimental system: Room Temperature → Sweaters On. Post-intervention: intervening on the effect (putting sweaters on) leaves Room Temperature unchanged; manipulating an effect does not alter its cause.

9. Modeling Ideal Interventions: Intervening on the Cause
Intervening on the cause (setting Room Temperature) does change the effect: after the intervention, Sweaters On still responds to the manipulated Room Temperature.

10. Interventions and Causal Graphs
Model an ideal intervention by adding an intervention variable, outside the original system, as a direct cause of its target. The slide contrasts the pre-intervention graph with the result of intervening on Income: a soft intervention adds the intervention variable alongside Income's existing causes, while a hard intervention makes it Income's only cause.
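This recipe is pure graph surgery and can be sketched in a few lines over a child → parents map (the Education → Income graph below is a stand-in for the slide's Income example):

```python
def hard_intervention(parents, target, ivar="I"):
    """Hard intervention: sever all edges into the target;
    the intervention variable becomes its only direct cause."""
    g = {v: set(ps) for v, ps in parents.items()}
    g[target] = {ivar}
    g[ivar] = set()
    return g

def soft_intervention(parents, target, ivar="I"):
    """Soft intervention: keep the target's original causes and
    add the intervention variable as an extra direct cause."""
    g = {v: set(ps) for v, ps in parents.items()}
    g[target] = g.get(target, set()) | {ivar}
    g[ivar] = set()
    return g

# Pre-intervention graph (child -> parents): Education -> Income
pre = {"Education": set(), "Income": {"Education"}}
hard = hard_intervention(pre, "Income")
soft = soft_intervention(pre, "Income")
print(sorted(hard["Income"]))  # ['I']
print(sorted(soft["Income"]))  # ['Education', 'I']
```

Both functions copy the graph rather than mutate it, so the pre-intervention model survives for comparison, mirroring the slide's side-by-side graphs.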

11. Association Underdetermines Causal Structure
TV → Obesity, Obesity → TV, and the spurious association TV ← C → Obesity all entail that TV is not independent of Obesity, so observed association alone cannot decide among them.

12. Randomization: Association = Causation
With a Randomizer as the sole cause of TV, any edge from a confounder C into TV is cut. If TV and Obesity remain associated, the structure must include TV → Obesity; if TV _||_ Obesity, TV does not cause Obesity.

13. Randomization: Association = Causation?
With a Randomizer assigning Treatment and an unobserved U affecting only Response, randomization guarantees that association between Treatment and Response reflects causation (Treatment _||_ Response when there is no effect). But if U affects both Dropout and Response, analyzing only subjects with Dropout = no conditions on a collider: Treatment and Response can be associated given Dropout = no even when Treatment has no effect.

14. Randomization: Association = Causation?
Even with a Randomizer assigning Treatment, the subject's Belief about the treatment assignment can itself causally affect Response. Association between Treatment and Response then mixes the direct effect with the path through Belief (the placebo problem).

15. Experimental Control and Statistical Control
Statistically controlling for a variable means conditioning on it; experimentally controlling for it means setting it with a Randomizer. For a common cause C of X1 and X3, both succeed: X3 _||_ X1 | C and X3 _||_ X1 | C(set). Likewise for a mediator M on the path from X1 to X3: X3 _||_ X1 | M and X3 _||_ X1 | M(set).

16. Experimental Control ≠ Statistical Control
With latent confounding the two come apart. If X1 → M → X3 and a latent U causes both M and X3, then M is a collider on the path X1 → M ← U → X3: conditioning on M opens that path, so statistically controlling for M fails (X3 is not independent of X1 given M), while setting M severs its incoming edges, so X3 _||_ X1 | M(set). With latent confounders U1 and U2 the same contrast holds: X3 _||_ X1 | M fails while X3 _||_ X1 | M(set) holds.

17. From Causal Model to Data
A Causal Model(V) is a graph over V (e.g., X → Y → Z) together with Structural Eqs.(V) or CPT(V). An Experimental Setup(V) partitions V = {O, M} into observed and manipulated variables and specifies P(M). Intervening with I yields the Manipulated Causal Model_M(V), with Structural Eqs._M(V) or CPT_M(V), and the manipulated distribution P_M(V). Sampling then produces Data: P(V) = f(Causal Model(V), Experimental Setup(V)).

18. Causal Discovery
From the Experimental Setup(V) (V = {O, M}, P(M)) and P_M(V) we obtain Data; statistical inference feeds a Discovery Algorithm, which outputs an equivalence class of causal structures. General assumptions: Markov, faithfulness, linearity, Gaussianity, acyclicity, etc.

19. Causal Discovery from Passive Observation
- PC, GES: patterns (Markov equivalence class; no latent confounding)
- FCI: PAGs (Markov equivalence, allowing confounders and selection bias)
- CCD: linear cyclic models (no confounding)
- BPC: linear latent variable models
- LiNGAM: a unique DAG (no confounding; linear non-Gaussian; faithfulness not needed)
- LvLiNGAM: a set of DAGs (confounders allowed)
- CyclicLiNGAM: a set of directed graphs (cyclic models, no confounding)
- Non-linear additive noise models: a unique DAG

20. Causal Discovery from Manipulations/Interventions
What sorts of manipulations/interventions have been studied (on a system such as X → Y)?
- Do(X = x): replace P(X | parents(X)) with P(X = x) = 1.0
- Randomize(X): replace P(X | parents(X)) with P_M(X), e.g., uniform
- Soft interventions: replace P(X | parents(X)) with P_M(X | parents(X), I), P_M(I)
- Simultaneous interventions
- Sequential interventions
- Sequential, conditional interventions
- Time-sensitive interventions:
  - Shock and run: set X at time t, then let the system run
  - Clamp: set X at time t and hold it fixed until time t + Δ

21. Simultaneous Interventions Destroy Information
Experimental setup: Randomize(X, Y) independently. Then P_M(V) yields X _||_ Y no matter what the true structure is: randomizing both variables severs all edges into them, so the resulting equivalence class contains every causal structure over {X, Y}, and the data distinguish nothing.

22. Simultaneous Interventions Destroy Information, But:
With a sequence of single-variable interventions over N variables, N − 1 experiments are needed to guarantee causal identification. With a sequence of simultaneous interventions, 2·log(N) + 1 experiments suffice.
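Reading the slide's bounds literally (and rounding the base-2 logarithm up, which is this sketch's assumption), the advantage of simultaneous interventions grows quickly with N:

```python
import math

def single_var_experiments(n):
    # worst case for a sequence of single-variable interventions
    return n - 1

def simultaneous_experiments(n):
    # the slide's bound for sequences of simultaneous interventions
    return 2 * math.ceil(math.log2(n)) + 1

for n in (4, 16, 256, 1024):
    print(n, single_var_experiments(n), simultaneous_experiments(n))
```

For N = 1024 this is 1023 experiments versus 21; for very small N (e.g., N = 4) the logarithmic bound is not yet an improvement.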

23. Equivalence-Class Oddities
True model: X → Y. Experimental setup: Randomize(Y). Then P_M(V) yields X _||_ Y, and the manipulated graph is X together with I → Y: randomizing Y severed the very edge X → Y we hoped to detect.

24. Equivalence-Class Oddities
Experimental setup: Randomize(Y), with P_M(V) yielding X _||_ Y. Because randomizing Y severs all edges into Y, the equivalence class still contains X → Y, the structure with no edge between X and Y, and confounded structures; only structures in which Y causes X are excluded.

25. Equivalence-Class Oddities
Experimental setup: Randomize(X, Y) independently. If P_M(V) shows X and Z dependent, the equivalence class is the set of structures in which X is an ancestor of Z via a path that does not pass through Y (randomizing Y severs Y's incoming edges, so no such path can run through Y).

26. Issues
- Efficiently representing a wider array of information relevant to causal structure discovery, and efficiently combining it to maximally constrain the possible explanations of the data
- Rate of reaching equilibrium vs. rate of sampling
- Transportability
- Constructing appropriate variables from raw measurements
- High dimensionality
