Marginal Independence and Conditional Independence. Computer Science CPSC 322, Lecture 26 (Textbook Chpt. 6.1-2). March 19, 2010.

Presentation transcript:

Marginal Independence and Conditional Independence. Computer Science CPSC 322, Lecture 26 (Textbook Chpt. 6.1-2). March 19, 2010.

Lecture Overview
–Recap with Example: Marginalization, Conditional Probability, Chain Rule
–Bayes' Rule
–Marginal Independence
–Conditional Independence: our most basic and robust form of knowledge about uncertain environments

Recap: Joint Distribution
3 binary random variables: P(H,S,F)
–H: dom(H) = {h, ¬h} (has heart disease, does not have heart disease)
–S: dom(S) = {s, ¬s} (smokes, does not smoke)
–F: dom(F) = {f, ¬f} (high-fat diet, low-fat diet)

Recap: Joint Distribution (cont.)
[Table of the eight joint probabilities P(H,S,F), indexed by {h, ¬h} × {s, ¬s} × {f, ¬f}; the numeric entries did not survive transcription.]

Recap: Marginalization
From the joint P(H,S,F), what are P(H,S)? P(H)? P(S)?
[Worked on the P(H,S,F) table from the previous slide; numeric entries not recoverable.]
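
For reference, marginalization (summing out) has this general form for any sets of variables Y and Z, matching the recap above:

```latex
P(Y) = \sum_{z \in \mathrm{dom}(Z)} P(Y, Z = z)
% e.g., summing F out of the joint:
P(H, S) = P(H, S, f) + P(H, S, \neg f)
```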

Recap: Conditional Probability
From P(H,S), P(H), and P(S), compute the conditional distributions P(S|H) and P(H|S).
[Tables for P(H,S), P(S|H), and P(H|S), indexed by {h, ¬h} and {s, ¬s}; numeric entries not recoverable.]
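
A minimal sketch of these recap computations in code; the joint probabilities below are made up, since the table values on the slides are not recoverable:

```python
# Hypothetical joint distribution P(H,S,F); True = h/s/f, False = the negation.
# These numbers are illustrative only and must sum to 1.
joint = {
    (True,  True,  True):  0.15, (True,  True,  False): 0.05,
    (True,  False, True):  0.05, (True,  False, False): 0.05,
    (False, True,  True):  0.10, (False, True,  False): 0.10,
    (False, False, True):  0.10, (False, False, False): 0.40,
}

def marginal_HS(joint):
    """Sum out F to get P(H,S)."""
    p = {}
    for (h, s, f), pr in joint.items():
        p[(h, s)] = p.get((h, s), 0.0) + pr
    return p

p_hs = marginal_HS(joint)
p_h = {h: sum(pr for (hh, s), pr in p_hs.items() if hh == h) for h in (True, False)}

# Conditioning: P(S=s | H=h) = P(H=h, S=s) / P(H=h)
p_s_given_h = {(s, h): p_hs[(h, s)] / p_h[h]
               for h in (True, False) for s in (True, False)}
print(p_s_given_h[(True, True)])  # P(s | h) under the made-up numbers
```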

Recap: Conditional Probability (cont.)
Two key points we covered in the previous lecture:
–We derived the equality P(X|Y) = P(X,Y) / P(Y) from the possible-world semantics of probability.
–P(X|Y) is not a single probability distribution but a family of distributions, one for each configuration of the conditioning variable(s).

Recap: Chain Rule and Bayes' Theorem
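
In their standard forms, as covered in the previous lecture:

```latex
% Chain rule: any joint factors into a product of conditionals
P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \dots, X_{i-1})

% Bayes' theorem: follows from the two factorizations of P(X, Y)
P(X \mid Y) = \frac{P(Y \mid X)\, P(X)}{P(Y)}
```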

Lecture Overview
–Recap with Example and Bayes' Theorem
–Marginal Independence
–Conditional Independence

Do you always need to revise your beliefs?
Not when your knowledge of Y's value doesn't affect your belief in the value of X.
DEF. Random variable X is marginally independent of random variable Y if, for all x_i ∈ dom(X), y_k ∈ dom(Y):
P(X = x_i | Y = y_k) = P(X = x_i)

Marginal Independence: Example
X and Y are independent iff: P(X|Y) = P(X), or P(Y|X) = P(Y), or P(X,Y) = P(X) P(Y)
That is, new evidence Y (or X) does not affect the current belief in X (or Y).
Ex: P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity) P(Weather)
A JPD requiring 32 entries is reduced to two smaller ones (8 and 4).
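
A quick sketch of checking this definition numerically on a joint table; the table below is hypothetical, constructed so that X and Y are independent:

```python
# Check P(X,Y) == P(X) * P(Y) for all value pairs (within floating-point tolerance).
joint_xy = {
    (True,  True):  0.2 * 0.7, (True,  False): 0.2 * 0.3,
    (False, True):  0.8 * 0.7, (False, False): 0.8 * 0.3,
}

def marginally_independent(joint, tol=1e-9):
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
    p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}
    return all(abs(joint[(x, y)] - p_x[x] * p_y[y]) <= tol
               for x in xs for y in ys)

print(marginally_independent(joint_xy))  # True for this constructed table
```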

In our example, are Smoking and Heart Disease marginally independent? What are our probabilities telling us?
[Compare the P(H,S), P(H), P(S), and P(S|H) tables from the earlier slides; numeric entries not recoverable.]

Lecture Overview
–Recap with Example
–Marginal Independence
–Conditional Independence

With marginal independence, the joint over n independent (binary) random variables needs O(n) numbers instead of O(2^n).
Absolute independence is powerful, but when you model a particular domain it is rare.
Dentistry is a large field with hundreds of variables, few of which are independent (e.g., Cavity, Heart-disease). What to do?

Look for a weaker form of independence: P(Toothache, Cavity, Catch)
Are Toothache and Catch marginally independent? BUT:
If I have a cavity, does the probability that the probe catches depend on whether I have a toothache?
(1) P(catch | toothache, cavity) =
What if I haven't got a cavity?
(2) P(catch | toothache, ¬cavity) =
Each is directly caused by the cavity, but neither has a direct effect on the other.

Conditional Independence
In general, Catch is conditionally independent of Toothache given Cavity:
P(Catch | Toothache, Cavity) = P(Catch | Cavity)
Equivalent statements:
P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)

Proof of equivalent statements
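
One standard way to derive the last equivalent statement from the definition, using the chain rule:

```latex
% Assume P(Catch \mid Toothache, Cavity) = P(Catch \mid Cavity).
% Then, by the chain rule applied within the conditional distribution:
P(Toothache, Catch \mid Cavity)
  = P(Catch \mid Toothache, Cavity)\, P(Toothache \mid Cavity)
  = P(Catch \mid Cavity)\, P(Toothache \mid Cavity)
```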

Conditional Independence: Formal Def.
DEF. Random variable X is conditionally independent of random variable Y given random variable Z if, for all x_i ∈ dom(X), y_k ∈ dom(Y), z_m ∈ dom(Z):
P(X = x_i | Y = y_k, Z = z_m) = P(X = x_i | Z = z_m)
That is, knowledge of Y's value doesn't affect your belief in the value of X, given a value of Z.
Sometimes two variables might not be marginally independent; however, they become independent after we observe some third variable.
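
Extending the earlier sketch, a hypothetical numerical check of this definition on a three-variable joint P(X,Y,Z):

```python
# The joint is built so that X and Y are conditionally independent given Z.
p_z = {True: 0.3, False: 0.7}
p_x_given_z = {(True, True): 0.9, (False, True): 0.1,   # keys are (x, z)
               (True, False): 0.2, (False, False): 0.8}
p_y_given_z = {(True, True): 0.6, (False, True): 0.4,   # keys are (y, z)
               (True, False): 0.5, (False, False): 0.5}

# Assemble P(X,Y,Z) = P(X|Z) P(Y|Z) P(Z), which satisfies X ⊥ Y | Z.
joint = {(x, y, z): p_x_given_z[(x, z)] * p_y_given_z[(y, z)] * p_z[z]
         for x in (True, False) for y in (True, False) for z in (True, False)}

def conditionally_independent(joint, tol=1e-9):
    """Test P(X=x | Y=y, Z=z) == P(X=x | Z=z) for every (x, y, z)."""
    vals = (True, False)
    for z in vals:
        pz = sum(joint[(x, y, z)] for x in vals for y in vals)
        for x in vals:
            pxz = sum(joint[(x, y, z)] for y in vals)
            for y in vals:
                pyz = sum(joint[(xx, y, z)] for xx in vals)
                if abs(joint[(x, y, z)] / pyz - pxz / pz) > tol:
                    return False
    return True

print(conditionally_independent(joint))  # True by construction
```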

Conditional Independence: Use
Write out the full joint distribution using the chain rule:
P(Cavity, Catch, Toothache)
= P(Toothache | Catch, Cavity) P(Catch | Cavity) P(Cavity)
= P(Toothache | Cavity) P(Catch | Cavity) P(Cavity)
How many probabilities?
The use of conditional independence often reduces the size of the representation of the joint distribution from exponential in n to linear in n. What is n?
Conditional independence is our most basic and robust form of knowledge about uncertain environments.
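
A sketch of this factored representation with hypothetical numbers; three binary variables need 2 + 2 + 1 = 5 stored parameters here, rather than the 7 independent entries of the full joint:

```python
# Factored representation of P(Cavity, Catch, Toothache) using
# Toothache ⊥ Catch | Cavity. All numbers are hypothetical.
p_cavity = {True: 0.2, False: 0.8}                   # 1 free parameter
p_catch_given_cavity = {True: 0.9, False: 0.05}      # P(catch | cavity=key), 2 params
p_tooth_given_cavity = {True: 0.6, False: 0.1}       # P(toothache | cavity=key), 2 params

def joint(cavity, catch, toothache):
    """Reassemble any entry of the full joint from the 5 stored parameters."""
    pc = p_catch_given_cavity[cavity]
    pt = p_tooth_given_cavity[cavity]
    return (p_cavity[cavity]
            * (pc if catch else 1 - pc)
            * (pt if toothache else 1 - pt))

# Sanity check: the eight reconstructed entries sum to 1.
total = sum(joint(c, k, t) for c in (True, False)
            for k in (True, False) for t in (True, False))
print(round(total, 10))  # 1.0
```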

Conditional Independence: Example 2
Given whether there is or isn't power in wire w0, is whether light l1 is lit independent of the position of switch s2?

Conditional Independence: Example 3
Is every other variable in the system independent of whether light l1 is lit, given whether there is power in wire w0?

Learning Goals for today's class. You can:
–Derive the Bayes Rule
–Define and use Marginal Independence
–Define and use Conditional Independence

Where are we? (Summary)
–Probability is a rigorous formalism for uncertain knowledge.
–The joint probability distribution specifies the probability of every possible world.
–Queries can be answered by summing over possible worlds.
–For nontrivial domains, we must find a way to reduce the joint distribution size.
–Independence (rare) and conditional independence (frequent) provide the tools.

Next Class: Bayesian Networks (Chpt. 6.3)