
1 Knowledge Repn. & Reasoning Lec #17: Continuous & Discrete UIUC CS 498: Section EA Professor: Eyal Amir Fall Semester 2004 (Based on slides by Michael Simon)

2 Today
● Multivariate Gaussian
  – Definitions
  – Marginalization & Conditioning
● Conditional Linear Gaussian Networks
  – NP-Completeness
● Approximation Methods

3 Multivariate Gaussian
● Basic density definition:
  p(x) = (2\pi)^{-n/2} |\Sigma|^{-1/2} \exp\!\left(-\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)
● where \mu is an n x 1 vector and \Sigma is an n x n symmetric matrix:
  \Sigma = \begin{pmatrix} \mathrm{cov}(x_1,x_1) & \cdots & \mathrm{cov}(x_1,x_n) \\ \vdots & \ddots & \vdots \\ \mathrm{cov}(x_n,x_1) & \cdots & \mathrm{cov}(x_n,x_n) \end{pmatrix}
● \mathrm{cov}(x_i,x_j) = E[(x_i - \mu_i)(x_j - \mu_j)]
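A minimal numpy sketch of evaluating this density (the function name and example values are mine):

```python
import numpy as np

def gaussian_density(x, mu, sigma):
    """Evaluate the multivariate Gaussian density N(x | mu, sigma).

    x, mu: length-n vectors; sigma: n x n symmetric positive-definite matrix.
    """
    n = mu.shape[0]
    diff = x - mu
    # (2*pi)^(-n/2) * |Sigma|^(-1/2) * exp(-1/2 (x-mu)^T Sigma^-1 (x-mu))
    norm = (2 * np.pi) ** (-n / 2) * np.linalg.det(sigma) ** -0.5
    quad = diff @ np.linalg.solve(sigma, diff)
    return norm * np.exp(-0.5 * quad)

# Example: standard bivariate Gaussian at the origin -> 1/(2*pi) ~ 0.1592
print(gaussian_density(np.zeros(2), np.zeros(2), np.eye(2)))
```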

4 Joint Distributions
● Partition x into x_1, x_2 (sizes p and q) and partition \mu and \Sigma similarly, turning the Gaussian into:
  \mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \quad \Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}
● Do the new \mu_1, \mu_2, \Sigma_{11}, and \Sigma_{22} mean anything?
● Can we condition on this partitioning?

5 Marginalization and Conditioning
● What we want: the marginal p(x_1) and the conditional p(x_1 \mid x_2)
● How to get it: partitioning
● Work with \Sigma and \Sigma^{-1}

6 Partitioning Matrices
● 'Block Diagonalization'
  – Start with M = \begin{pmatrix} E & F \\ G & H \end{pmatrix}
  – Create matrices which look like \begin{pmatrix} I & -FH^{-1} \\ 0 & I \end{pmatrix} and \begin{pmatrix} I & 0 \\ -H^{-1}G & I \end{pmatrix}
  – Multiply M by them to eliminate the off-diagonal blocks
  – Call M/H = E - FH^{-1}G the Schur complement
  – Note |M| = |M/H| \cdot |H| (it will be useful later)
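A small numpy check of this identity (block sizes and entries are arbitrary): the two triangular multipliers block-diagonalize M, and |M| = |M/H|·|H| falls out because both multipliers have determinant 1.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 2, 3
M = rng.normal(size=(p + q, p + q))
E, F = M[:p, :p], M[:p, p:]
G, H = M[p:, :p], M[p:, p:]

# Schur complement of H in M
M_over_H = E - F @ np.linalg.inv(H) @ G

# Left/right triangular multipliers that zero out the off-diagonal blocks
I_p, I_q = np.eye(p), np.eye(q)
left = np.block([[I_p, -F @ np.linalg.inv(H)], [np.zeros((q, p)), I_q]])
right = np.block([[I_p, np.zeros((p, q))], [-np.linalg.inv(H) @ G, I_q]])

D = left @ M @ right  # block-diagonal: diag(M/H, H)
assert np.allclose(D[:p, p:], 0) and np.allclose(D[p:, :p], 0)
assert np.isclose(np.linalg.det(M), np.linalg.det(M_over_H) * np.linalg.det(H))
```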

7 Partitioning Matrices
● Using E instead of H through the same process leads us to the other Schur complement:
  M/E = H - GE^{-1}F, \quad |M| = |M/E| \cdot |E|

8 Partitioning
● Applying block diagonalization to the partitioned \Sigma splits the joint quadratic form into a term involving only x_2 and a term involving x_1 given x_2, factoring p(x_1, x_2) = p(x_1 \mid x_2)\, p(x_2)

9 Summary of Results
● Marginal: p(x_2) = \mathcal{N}(\mu_2, \Sigma_{22})
● Conditional: p(x_1 \mid x_2) = \mathcal{N}\!\left(\mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(x_2 - \mu_2),\; \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}\right)
● The conditional covariance is exactly the Schur complement \Sigma/\Sigma_{22}
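A short numpy sketch of both results (function name and example numbers are mine): condition x_1 on an observed x_2 using the partitioned blocks.

```python
import numpy as np

def condition_gaussian(mu, sigma, p, x2):
    """Given N(mu, sigma) over (x1, x2) with len(x1) == p and an observed
    x2, return the mean and covariance of p(x1 | x2)."""
    mu1, mu2 = mu[:p], mu[p:]
    s11, s12 = sigma[:p, :p], sigma[:p, p:]
    s21, s22 = sigma[p:, :p], sigma[p:, p:]
    k = s12 @ np.linalg.inv(s22)
    cond_mu = mu1 + k @ (x2 - mu2)
    cond_sigma = s11 - k @ s21  # Schur complement Sigma11 - S12 S22^-1 S21
    return cond_mu, cond_sigma

mu = np.array([0.0, 1.0])
sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
print(condition_gaussian(mu, sigma, 1, np.array([2.0])))
# The marginal needs no computation: p(x1) is simply N(mu[:p], sigma[:p, :p])
```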

10 Conditional Linear Gaussian
● Discrete and continuous nodes
  – Discrete nodes have discrete parents {D_1, D_2, ..., D_k}
  – Continuous nodes {Y_1, Y_2, ..., Y_k}
● CPD for a continuous node Y with discrete parents' assignment d and continuous parents y_1, ..., y_k:
  P(Y \mid d, y_1, ..., y_k) = \mathcal{N}\!\left(w_{d,0} + \textstyle\sum_i w_{d,i}\, y_i,\; \sigma_d^2\right)
● For each assignment of the discrete nodes, a CLG represents a multivariate Gaussian over the continuous nodes
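A toy sketch of such a CPD (the parameter values and names are illustrative assumptions, not from the lecture): for each assignment d of the discrete parents, Y is a linear Gaussian in its continuous parents.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical CLG CPD for a node Y with one binary discrete parent D
# and two continuous parents (y1, y2): one (w0, w, sigma) triple per d.
clg_params = {
    0: (0.0, np.array([1.0, -0.5]), 0.3),   # d = 0
    1: (2.0, np.array([0.2,  0.7]), 1.0),   # d = 1
}

def sample_clg(d, y):
    """Sample Y ~ N(w_{d,0} + w_d . y, sigma_d^2)."""
    w0, w, sigma = clg_params[d]
    return rng.normal(w0 + w @ y, sigma)

print(sample_clg(1, np.array([0.5, -1.0])))
```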

11 CLGs: NP-Completeness
● For polytrees, CLG inference is NP-complete
● Reduction from Subset-Sum
  – Given: S = {s_1, s_2, ..., s_n} and a target L
  – Find: a subset of S whose elements sum to L
● A_i, B: discrete (uniform prior)

12 CLGs: NP-Completeness
● Prove: there is a subset summing to L iff P(B=1 \mid Y=L) > 0.5

13 CLGs: NP-Completeness
● Case 1: there is a subset. Prove: P(B=1 \mid Y=L) > 0.5
● Case 2: there isn't a subset. Prove: P(B=1 \mid Y=L) < 0.5

14 CLGs: A Few More Points
● Direct approximate inference?
  – Don't forget about C
● Theory niggles
  – What if there's only one discrete ancestor?
  – Subset-Sum is only weakly NP-complete: it admits a pseudo-polynomial-time algorithm (see the sketch below)
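To make the pseudo-polynomial point concrete, here is the classic Subset-Sum dynamic program: its O(n·L) running time is polynomial in the numeric value of L, but exponential in L's bit length.

```python
def subset_sum(s, target):
    """Classic O(n * target) DP: is there a subset of s summing to target?
    Pseudo-polynomial: the table size grows with the *value* of target."""
    reachable = {0}
    for x in s:
        reachable |= {r + x for r in reachable if r + x <= target}
    return target in reachable

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False
```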

15 CLGs: Approximate Inference
● In general...
  – NP-completeness wins
  – Specific domains lead to exploitable structure
● Specific domain: fault diagnosis
  – Faults are rare, so hypotheses with few faults have higher prior probability

16 Fault Diagnosis
● Techniques
  – MCMC
  – Sampling
● However, neither works well for fault diagnosis
  – They rarely visit the low-probability fault events, so rare faults go undetected
● So concentrate on specific hypotheses
  – Generate them in order of prior likelihood (higher probability for a low number of faults), as sketched below
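A sketch of that enumeration idea (my own illustrative framing, not the authors' code): when the per-component fault probability is below 0.5, hypotheses with fewer faults always have higher prior, so enumerating by fault count enumerates by decreasing prior.

```python
from itertools import combinations

def hypotheses_by_prior(components, p_fault=0.01):
    """Yield fault hypotheses (sets of faulty components) in decreasing
    prior probability; with p_fault < 0.5 that is increasing fault count."""
    n = len(components)
    for k in range(n + 1):
        for faulty in combinations(components, k):
            prior = (p_fault ** k) * ((1 - p_fault) ** (n - k))
            yield set(faulty), prior

# First few hypotheses for a 3-pipe system: no-fault first, then singletons
for hyp, prior in list(hypotheses_by_prior(["p1", "p2", "p3"]))[:5]:
    print(hyp, round(prior, 6))
```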

17 Some Results
● 5-tank system
  – Sparse measurements
  – Rare failures of the pipes between tanks
  – Set up a specific failure and see if it can be diagnosed

18 Some Results
● Unrolled DBN
  – Only the prior-based method found the correct P(failure)
● Combined with (Lerner, Parr, Koller '00)
  – Tracks the system extremely well

