
CAUSAL REASONING FOR DECISION AIDING SYSTEMS
Cognitive Systems Laboratory, UCLA
Judea Pearl, Mark Hopkins, Blai Bonet, Chen Avin, Ilya Shpitser

PRESENTATIONS
Judea Pearl: Robustness of Causal Claims
Ilya Shpitser and Chen Avin: Experimental Testability of Counterfactuals
Blai Bonet: Logic-based Inference on Bayes Networks
Mark Hopkins: Inference using Instantiations
Chen Avin: Inference in Sensor Networks
Blai Bonet: Report from the Probabilistic Planning Competition

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES
Probability and statistics deal with static relations:
Data → joint distribution → inferences from passive observations.
Causal analysis deals with changes (dynamics):
Data + causal assumptions (+ experiments) → causal model →
1. Effects of interventions
2. Causes of effects
3. Explanations

TYPICAL CAUSAL MODEL
[Diagram: a causal network over the variables X, Y, Z, mapping inputs to outputs.]

TYPICAL CLAIMS
1. Effects of potential interventions
2. Claims about attribution (responsibility)
3. Claims about direct and indirect effects
4. Claims about explanations

ROBUSTNESS: MOTIVATION
The effect of smoking on cancer is, in general, non-identifiable from observational studies.
[Diagram: Smoking (x) → Cancer (y), both influenced by unobserved Genetic Factors (u).]
In linear systems: y = αx + u, and α is non-identifiable.
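The point can be seen numerically. Below is a minimal simulation sketch (the coefficients and the linear-Gaussian form are assumptions chosen for illustration, not taken from the slide): the naive regression of y on x does not recover α when the unobserved u drives both variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
alpha = 0.5                              # assumed true effect of smoking (x) on cancer (y)

u = rng.normal(size=n)                   # unobserved genetic factor
x = 0.8 * u + rng.normal(size=n)         # smoking, partly driven by u
y = alpha * x + u + rng.normal(size=n)   # cancer, driven by x and by u

# Naive regression slope R_yx = cov(y, x) / var(x): biased away from alpha
r_yx = np.cov(y, x)[0, 1] / np.var(x, ddof=1)
print(f"true alpha = {alpha}, naive regression slope = {r_yx:.3f}")
```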

ROBUSTNESS: MOTIVATION
Z (Price of Cigarettes) is an instrumental variable: cov(z, u) = 0.
[Diagram: Z → Smoking (x) → Cancer (y), with unobserved Genetic Factors (u) affecting x and y.]
With the instrument, α is identifiable.
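A standard derivation of the instrumental-variable estimand fills in the algebra behind "α is identifiable" (a sketch under the linear model y = αx + u with cov(z, u) = 0; the estimand itself is not printed on the slide):

```latex
\begin{align*}
\operatorname{cov}(z, y) &= \alpha \operatorname{cov}(z, x) + \operatorname{cov}(z, u)
                          = \alpha \operatorname{cov}(z, x) \\[2pt]
\Rightarrow\quad \alpha  &= \frac{\operatorname{cov}(z, y)}{\operatorname{cov}(z, x)}
                          = \frac{R_{yz}}{R_{xz}}
\end{align*}
```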

ROBUSTNESS: MOTIVATION
Problem with instrumental variables: the model may be wrong!
[Diagram: the same graph, Z (Price of Cigarettes) → Smoking (x) → Cancer (y), with unobserved Genetic Factors (u).]

ROBUSTNESS: MOTIVATION
Solution: invoke several instruments.
[Diagram: Z1 (Price of Cigarettes) and Z2 (Peer Pressure) → Smoking (x) → Cancer (y), with unobserved Genetic Factors (u).]
Surprise: α1 = α2 ⇒ the model is likely correct.

ROBUSTNESS: MOTIVATION
[Diagram: many instruments Z1 (Price of Cigarettes), Z2 (Peer Pressure), Z3 (Anti-smoking Legislation), ..., Zn → Smoking (x) → Cancer (y), with unobserved Genetic Factors (u).]
Greater surprise: α1 = α2 = α3 = ... = αn = q ⇒ the claim α = q is highly likely to be correct.
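A simulation sketch of this equality surprise (instrument strengths and noise levels below are hypothetical): each independent instrument yields its own IV estimate of α, and it is their mutual agreement that corroborates the model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
alpha = 0.5

u = rng.normal(size=n)                       # unobserved genetic factor
z = rng.normal(size=(3, n))                  # three independent instruments
x = 0.8 * u + z.sum(axis=0) + rng.normal(size=n)
y = alpha * x + u + rng.normal(size=n)

# One IV estimand per instrument: alpha_i = cov(z_i, y) / cov(z_i, x)
for i in range(3):
    est = np.cov(z[i], y)[0, 1] / np.cov(z[i], x)[0, 1]
    print(f"alpha estimate from instrument Z{i+1}: {est:.3f}")   # all close to 0.5
```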

ROBUSTNESS: MOTIVATION
[Diagram: Smoking (x) → Cancer (y) → Symptom (s), with unobserved Genetic Factors (u) affecting x and y.]
Symptoms do not act as instruments; α remains non-identifiable.
Why? Taking a noisy measurement (s) of an observed variable (y) cannot add new information.
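The same point in simulation (parameterization assumed, as before): a symptom s inherits y's dependence on u, so it violates the instrumental condition cov(s, u) = 0, and the corresponding ratio does not recover α.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
alpha = 0.5

u = rng.normal(size=n)
x = 0.8 * u + rng.normal(size=n)
y = alpha * x + u + rng.normal(size=n)
s = y + rng.normal(size=n)                      # symptom: a noisy measurement of y

print(np.cov(s, u)[0, 1])                       # clearly nonzero: s fails cov(s, u) = 0
print(np.cov(s, y)[0, 1] / np.cov(s, x)[0, 1])  # the "IV"-style ratio, far from alpha
```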

ROBUSTNESS: MOTIVATION
[Diagram: Smoking (x) → Cancer (y) → Symptoms S1, S2, ..., Sn, with unobserved Genetic Factors (u).]
Adding many symptoms does not help; α remains non-identifiable.

ROBUSTNESS: MOTIVATION
Given a parameter α in a general graph, find whether α can evoke an equality surprise α1 = α2 = ... = αn associated with several independent estimands of α.
Formulate: surprise, over-identification, independence.
Robustness: the degree to which α is robust to violations of model assumptions.

ROBUSTNESS: FORMULATION
Bad attempt: a parameter α is robust (over-identifies) if there exist two distinct functions f1 and f2 of the observed distribution P such that α = f1(P) = f2(P).

ROBUSTNESS: FORMULATION
Chain model x → y → z with disturbances e_x, e_y, e_z:
x = e_x
y = bx + e_y
z = cy + e_z
Regression coefficients: R_yx = b, R_zx = bc, R_zy = c.
Constraint: R_zx = R_zy R_yx, involving both (b) and (c).
Yet the edge y → z is irrelevant to the derivation of b.
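The regression coefficients on this slide follow from the structural equations when the disturbances are mutually uncorrelated (a sketch of the algebra; not printed on the slide):

```latex
\begin{align*}
R_{yx} &= \frac{\operatorname{cov}(y, x)}{\operatorname{var}(x)}
        = \frac{b\,\operatorname{var}(e_x)}{\operatorname{var}(e_x)} = b, \\[2pt]
R_{zx} &= \frac{\operatorname{cov}(z, x)}{\operatorname{var}(x)}
        = \frac{c\,\operatorname{cov}(y, x)}{\operatorname{var}(x)} = bc, \\[2pt]
R_{zy} &= \frac{\operatorname{cov}(z, y)}{\operatorname{var}(y)}
        = \frac{c\,\operatorname{var}(y)}{\operatorname{var}(y)} = c.
\end{align*}
% Hence the over-identifying constraint R_{zx} = R_{zy} R_{yx} holds,
% while the coefficient c plays no role in deriving b.
```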

RELEVANCE: FORMULATION
Definition 8. Let A be an assumption embodied in model M, and p a parameter in M. A is said to be relevant to p if and only if there exists a set of assumptions S in M such that S and A sustain the identification of p but S alone does not sustain such identification.
Theorem 2. An assumption A is relevant to p if and only if A is a member of a minimal set of assumptions sufficient for identifying p.
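An illustration of Definition 8 and Theorem 2 in the chain model of the surrounding slides (my example; treating the zero-covariance restrictions among disturbances as the assumptions embodied in M):

```latex
\begin{align*}
A &:\ \operatorname{cov}(e_y, e_z) = 0, \qquad S = \{\operatorname{cov}(e_x, e_z) = 0\} \\
S \cup \{A\} &\ \Rightarrow\ c = R_{zy} \ \text{(identified)}, \qquad
S \ \text{alone does not identify } c \\
&\Rightarrow\ A \ \text{is relevant to } c, \ \text{and } S \cup \{A\}
 \ \text{is a minimal identifying set for } c.
\end{align*}
```

By contrast, A plays no role in deriving b = R_yx, matching the earlier remark that the edge y → z is irrelevant to the derivation of b.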

ROBUSTNESS: FORMULATION
Definition 5 (Degree of over-identification). A parameter p (of model M) is identified to degree k (read: k-identified) if there are k minimal sets of assumptions, each yielding a distinct estimand of p.

ROBUSTNESS: FORMULATION
Chain x → y → z with coefficients b (on x → y) and c (on y → z).
Minimal assumption sets for c: [diagrams: three graphs G1, G2, G3 over x, y, z, each yielding a distinct estimand of c].
Minimal assumption sets for b: [diagram: a single graph over x, y, z].
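A numerical sketch of k-identification in the chain model (the particular estimands evaluated below, R_zy, R_zy·x and R_zx / R_yx for c, and R_yx for b, are my reading of the graphs G1, G2, G3 and are stated as assumptions rather than a transcription of the slide): when all model assumptions hold, the distinct estimands of c agree, while b is pinned down by a single estimand.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
b, c = 0.7, 1.3                              # structural coefficients of x -> y -> z

e_x, e_y, e_z = rng.normal(size=(3, n))      # mutually independent disturbances
x = e_x
y = b * x + e_y
z = c * y + e_z

cov = lambda a, w: np.cov(a, w)[0, 1]
r_yx = cov(y, x) / np.var(x, ddof=1)
r_zx = cov(z, x) / np.var(x, ddof=1)
r_zy = cov(z, y) / np.var(y, ddof=1)
# Partial regression coefficient of z on y, holding x constant (R_zy.x)
r_zy_x = np.linalg.lstsq(np.column_stack([y, x, np.ones(n)]), z, rcond=None)[0][0]

print("estimands of c:", round(r_zy, 3), round(r_zy_x, 3), round(r_zx / r_yx, 3))
print("estimand of b: ", round(r_yx, 3))
```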

FROM PARAMETERS TO CLAIMS: FROM MINIMAL ASSUMPTION SETS TO MAXIMAL EDGE SUPERGRAPHS
Definition. A claim C is identified to degree k in model M (graph G) if there are k edge supergraphs of G that permit the identification of C, each yielding a distinct estimand.
Example claim (total effect): TE(x, z) = q in the chain x → y → z.
[Diagrams: two edge supergraphs of the chain, yielding the distinct estimands TE(x, z) = R_zx and TE(x, z) = R_yx R_zy·x.]
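In the linear parameterization of the chain, both estimands return the same total effect q = bc (a sketch, taking the second estimand to be the front-door-style expression R_yx R_zy·x):

```latex
\begin{align*}
TE(x, z) &= bc = R_{zx}, \\
TE(x, z) &= R_{yx}\, R_{zy \cdot x} = b \cdot c.
\end{align*}
```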

FROM PARAMETERS TO CLAIMS: FROM MINIMAL ASSUMPTION SETS TO MAXIMAL EDGE SUPERGRAPHS (NONPARAMETRIC)
The same definition and example claim TE(x, z) = q, now in the nonparametric case.
[Diagrams: edge supergraphs of the chain x → y → z that permit identification of TE(x, z).]

CONCLUSIONS
1. A formal definition of ROBUSTNESS of causal claims.
2. Graphical criteria and algorithms for computing the degree of robustness of a given causal claim.