1 Graphical Models and Applications CNS/EE148. Instructors: M. Polito, P. Perona, R. McEliece. TA: C. Fanti.

2 Example from Medical Diagnostics. (Figure: a Bayesian network for medical diagnosis.) Patient information: Visit to Asia, Smoking. Medical difficulties: Tuberculosis, Lung Cancer, Bronchitis, Tuberculosis or Cancer. Diagnostic tests: X-Ray Result, Dyspnea.

3 What is a graphical model? A graphical model is a way of representing probabilistic relationships between random variables. Variables are represented by nodes; conditional (in)dependencies are represented by (missing) edges. Undirected edges give correlations between variables (Markov Random Field, or Undirected Graphical Model); directed edges give causality relationships (Bayesian Network, or Directed Graphical Model).

4 “Graphical models are a marriage between probability theory and graph theory. They provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering – uncertainty and complexity – and in particular they are playing an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity – a complex system is built by combining simpler parts.

5 Probability theory provides the glue whereby the parts are combined, ensuring that the system as a whole is consistent, and providing ways to interface models to data. The graph theoretic side of graphical models provides both an intuitively appealing interface by which humans can model highly interacting sets of variables as well as a data structure that lends itself naturally to the design of efficient general-purpose algorithms. Many of the classical multivariate probabilistic systems studied in fields such as statistics, systems engineering, information theory, pattern recognition and statistical mechanics are special cases of the general graphical model formalism -- examples include mixture models, factor analysis, hidden Markov models, Kalman filters and Ising models.

6 The graphical model framework provides a way to view all of these systems as instances of a common underlying formalism. This view has many advantages -- in particular, specialized techniques that have been developed in one field can be transferred between research communities and exploited more widely. Moreover, the graphical model formalism provides a natural framework for the design of new systems." --- Michael Jordan, 1998.

7 We already know many graphical models. (Picture by Zoubin Ghahramani and Sam Roweis.)

8 Plan for the class
 - Introduction to Graphical Models (Polito)
   - Basics on graphical models and statistics
   - Learning from data
   - Exact inference
   - Approximate inference
 - Applications to Vision (Perona)
 - Applications to Coding Theory (McEliece)
 - Belief Propagation and Spin Glasses

10 Basics on graphical models and statistics: basics of graph theory; families of probability distributions associated with directed and undirected graphs; Markov properties and conditional independence; statistical concepts as building blocks for graphical models; density estimation, classification and regression.

11 Basics on graphical models and statistics: graphs and families of probability distributions. (Figure: a directed graph on nodes X1, ..., X7.) There is a family of probability distributions that can be represented with this graph. 1) Every P.D. presenting (at least) the conditional independencies that can be derived from the graph belongs to that family. 2) Every P.D. that can be factorized as p(x1,…,x7) = p(x1) p(x2) p(x3|x2) p(x4|x1,x2) p(x5|x4) p(x6|x5,x2) p(x7|x4) belongs to that family.
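As an illustration of point 2, a minimal sketch in Python (not part of the original slides; the binary conditional probability tables are randomly generated for illustration): multiplying the local conditionals of the factorization above yields a joint table that sums to one.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_cpt(n_parents):
    """Random CPT p(child | parents) for binary variables; child is the last axis."""
    t = rng.random((2,) * (n_parents + 1))
    return t / t.sum(axis=-1, keepdims=True)

p1, p2 = random_cpt(0), random_cpt(0)   # priors p(x1), p(x2)
p3 = random_cpt(1)                      # p(x3|x2)
p4 = random_cpt(2)                      # p(x4|x1,x2)
p5 = random_cpt(1)                      # p(x5|x4)
p6 = random_cpt(2)                      # p(x6|x5,x2)
p7 = random_cpt(1)                      # p(x7|x4)

# Joint over (x1,...,x7); the einsum letters a..g stand for x1..x7.
joint = np.einsum('a,b,bc,abd,de,ebf,dg->abcdefg', p1, p2, p3, p4, p5, p6, p7)
print(joint.sum())  # 1.0: the factors define a valid probability distribution
```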

12 Basics on graphical models and statistics: building blocks for graphical models. (Figure: elementary graphs such as X -> Y and parameter nodes Q -> X, with p(X) and p(Y|X) to be specified.) Bayesian approach: every unknown quantity (including parameters) is treated as a random variable. Density estimation: parametric and nonparametric methods. Regression: linear, conditional mixture, nonparametric. Classification: generative and discriminative approach.
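As a concrete instance of the first building block, a minimal sketch of parametric density estimation: fitting a Gaussian p(x) by maximum likelihood. The data here are synthetic, chosen only to make the estimates checkable.

```python
import numpy as np

# Synthetic data; in practice x would be observations of the variable X.
x = np.random.default_rng(1).normal(loc=2.0, scale=0.5, size=1000)

# Maximum-likelihood estimates for a Gaussian density p(x) = N(mu, sigma^2).
mu_hat = x.mean()
sigma2_hat = x.var()        # ML variance divides by N, not N - 1
print(mu_hat, sigma2_hat)   # close to the true values 2.0 and 0.25
```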

13 Plan for the class
 - Introduction to Graphical Models (Polito)
   - Basics on graphical models and statistics
   - Learning from data
   - Exact inference
   - Approximate inference
 - Applications to Vision (Perona)
 - Applications to Coding Theory (McEliece)
 - Belief Propagation and Spin Glasses

14 Learning from data: model structure and parameter estimation; complete observations and latent variables; MAP and ML estimation; the EM algorithm; model selection.

15 Which learning method applies:

Structure | Observability | Method
Known     | Full          | ML or MAP estimation
Known     | Partial       | EM algorithm
Unknown   | Full          | Model selection or model averaging
Unknown   | Partial       | EM + model selection or model averaging

16 Plan for the class
 - Introduction to Graphical Models (Polito)
   - Basics on graphical models and statistics
   - Learning from data
   - Exact inference
   - Approximate inference
 - Applications to Vision (Perona)
 - Applications to Coding Theory (McEliece)
 - Belief Propagation and Spin Glasses

17 Exact inference: the junction tree and related algorithms; belief propagation and belief revision; the generalized distributive law; Hidden Markov Models and Kalman filtering with graphical models.

18 Exact inference: conditional independencies. Given a probability distribution p(X,Y,Z,W), how can we decide whether the groups of variables X and Y are "conditionally independent" from each other once the value of the variables Z is assigned? With graphical models, we can implement an algorithm reducing this global problem to a series of local problems (see the Matlab demo of the Bayes-Ball algorithm). A brute-force numerical check of the same question is sketched below.
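A sketch of the brute-force check (not the Bayes-Ball algorithm itself, and not from the original slides): given a full joint table over three binary variables, test whether p(x, y | z) = p(x | z) p(y | z) for every value of z. The common-cause example below is constructed so the test comes out true.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_cond_indep(pxyz, tol=1e-10):
    """Check X _|_ Y | Z for a joint table pxyz with axes (x, y, z)."""
    pz  = pxyz.sum(axis=(0, 1))                 # p(z)
    pxz = pxyz.sum(axis=1)                      # p(x, z)
    pyz = pxyz.sum(axis=0)                      # p(y, z)
    lhs = pxyz / pz                             # p(x, y | z)
    rhs = (pxz / pz)[:, None, :] * (pyz / pz)[None, :, :]  # p(x|z) p(y|z)
    return np.allclose(lhs, rhs, atol=tol)

# Common-cause structure X <- Z -> Y: X and Y are independent given Z.
pz = np.array([0.3, 0.7])
px_z = rng.dirichlet(np.ones(2), size=2).T      # p(x|z), columns indexed by z
py_z = rng.dirichlet(np.ones(2), size=2).T      # p(y|z)
pxyz = px_z[:, None, :] * py_z[None, :, :] * pz[None, None, :]
print(is_cond_indep(pxyz))                      # True
```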

19 Exact inference: variable elimination and the distributive law. (Figure: a directed graph on nodes X1, ..., X7.) p(x1,…,x7) = p(x1) p(x2) p(x3|x2) p(x4|x1,x2) p(x5|x4) p(x6|x4) p(x7|x2,x5). Marginalize over x7: p(x1,…,x6) = Σ_x7 [ p(x1) p(x2) p(x3|x2) p(x4|x1,x2) p(x5|x4) p(x6|x4) p(x7|x2,x5) ]. Applying a "distributive law", the factors that do not mention x7 move outside the sum: p(x1,…,x6) = p(x1) p(x2) p(x3|x2) p(x4|x1,x2) p(x5|x4) p(x6|x4) Σ_x7 p(x7|x2,x5). The language of graphical models allows a general formalization of this method.
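The same manipulation in code, as a sketch with randomly generated binary tables (not from the slides): the distributive law lets the sum over x7 touch only the single factor that mentions x7, and the result matches marginalizing the full joint.

```python
import numpy as np

rng = np.random.default_rng(0)

def cpt(n_parents):
    t = rng.random((2,) * (n_parents + 1))
    return t / t.sum(axis=-1, keepdims=True)

p1, p2 = cpt(0), cpt(0)          # p(x1), p(x2)
p3 = cpt(1)                      # p(x3|x2)
p4 = cpt(2)                      # p(x4|x1,x2)
p5, p6 = cpt(1), cpt(1)          # p(x5|x4), p(x6|x4)
p7 = cpt(2)                      # p(x7|x2,x5)

# Naive: build the full 7-variable joint, then sum out x7 (letters a..g = x1..x7).
joint = np.einsum('a,b,bc,abd,de,df,beg->abcdefg', p1, p2, p3, p4, p5, p6, p7)
naive = joint.sum(axis=6)

# Distributive law: the sum over x7 involves only p(x7|x2,x5) (here it equals 1).
msg = p7.sum(axis=-1)            # sum_x7 p(x7|x2,x5), indexed by (x2, x5)
fast = np.einsum('a,b,bc,abd,de,df,be->abcdef', p1, p2, p3, p4, p5, p6, msg)

print(np.allclose(naive, fast))  # True
```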

20 Exact inference: junction graph and message passing. Group random variables which are "fully connected"; connect group-nodes that share members to obtain the "junction graph". Every node then only needs to "communicate" with its neighbors. If the junction graph is a tree, there is a "message passing" protocol which allows exact inference; a small example follows.
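A minimal flavor of the message passing idea, sketched on a 3-node chain x1 - x2 - x3 with made-up pairwise potentials (this is plain sum-product on a tree, not the full junction tree machinery): each message summarizes everything on the sender's side of an edge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pairwise potentials on the chain x1 - x2 - x3 (binary variables, made-up values).
psi12 = rng.random((2, 2))
psi23 = rng.random((2, 2))

# Messages flow inward toward x2; each is a sum over the sender's variable.
m1_to_2 = psi12.sum(axis=0)                 # sum_x1 psi12(x1, x2)
m3_to_2 = psi23.sum(axis=1)                 # sum_x3 psi23(x2, x3)

# Belief at x2 = product of incoming messages, then normalize.
b2 = m1_to_2 * m3_to_2
b2 /= b2.sum()

# Check against brute-force marginalization of the full joint.
joint = np.einsum('ab,bc->abc', psi12, psi23)
p2 = joint.sum(axis=(0, 2)) / joint.sum()
print(np.allclose(b2, p2))                  # True
```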

21 Plan for the class
 - Introduction to Graphical Models (Polito)
   - Basics on graphical models and statistics
   - Learning from data
   - Exact inference
   - Approximate inference
 - Applications to Vision (Perona)
 - Applications to Coding Theory (McEliece)
 - Belief Propagation and Spin Glasses

22 Approximate inference: Kullback-Leibler divergence and entropy; variational methods; Monte Carlo methods; loopy junction graphs and loopy belief propagation; performance of loopy belief propagation; Bethe approximation of free energy and belief propagation.

23 Approximate inference: Kullback-Leibler divergence and graphical models. The graphical model associated with a P.D. p(x) is too complicated. AND NOW?! We choose to approximate p(x) with q(x), obtained by making assumptions on the junction graph corresponding to the graphical model. Example: eliminate loops, or bound the number of nodes linked to each node. A good criterion for choosing q: minimize the cross-entropy, or Kullback-Leibler divergence, KL(q||p) = Σ_x q(x) log [ q(x) / p(x) ].
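A minimal sketch of the criterion on two made-up discrete distributions: the divergence is positive when q differs from p and exactly zero when q = p.

```python
import numpy as np

def kl(q, p):
    """Kullback-Leibler divergence KL(q || p) for discrete distributions."""
    mask = q > 0                     # 0 * log 0 is taken to be 0
    return np.sum(q[mask] * np.log(q[mask] / p[mask]))

p = np.array([0.5, 0.3, 0.2])        # target distribution
q = np.array([0.4, 0.4, 0.2])        # candidate approximation
print(kl(q, p), kl(q, q))            # positive, and exactly 0.0
```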

24 Approximate inference: variational methods. Example: the QMR-DT database. Diseases: d1, d2, d3; symptoms: f1, f2, f3, f4. (Figure: bipartite graph linking diseases to symptoms.) The joint factorizes as p(f,d) = Π_i p(f_i|d) Π_j p(d_j). By using a variational inequality to bound each conditional p(f_i|d), we get an approximation in which the node f_i is "unlinked" from the diseases.
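For context, QMR-DT models each symptom conditional p(f_i|d) as a noisy-OR; a sketch of that likelihood follows, with all parameter values made up for illustration (the variational bound itself is not reproduced here).

```python
import numpy as np

# Noisy-OR: p(f_i = 0 | d) = (1 - q0) * prod_j (1 - q_j)^{d_j},
# where q0 is a leak probability and q_j the chance that disease j
# alone activates symptom f_i. All values below are illustrative.
q0 = 0.05                            # leak term for symptom f_i
q = np.array([0.8, 0.3, 0.1])        # per-disease activation probabilities
d = np.array([1, 0, 1])              # diseases present: d1 and d3

p_f_off = (1 - q0) * np.prod((1 - q) ** d)
print(1 - p_f_off)                   # p(f_i = 1 | d)
```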

25 Approximate inference: loopy belief propagation. When the junction graph is not a tree, inference is not exact: roughly speaking, a message might pass through a node more than once, causing trouble. However, in certain cases, an iterated application of the message passing algorithm converges to a good candidate for the exact solution.
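A sketch of the iterated scheme on the smallest loopy case, a 3-cycle of binary variables with made-up pairwise potentials (not from the slides): the same sum-product update is simply repeated until the messages stop changing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pairwise potentials on the 3-cycle (0-1), (1-2), (2-0); values are made up.
edges = [(0, 1), (1, 2), (2, 0)]
psi = {e: rng.random((2, 2)) + 0.5 for e in edges}

# Directed messages m[(i, j)] from i to j, initialized uniform.
m = {(i, j): np.ones(2) / 2 for i, j in edges + [(j, i) for i, j in edges]}

def potential(i, j, xi, xj):
    """Look up psi for edge {i, j} regardless of stored orientation."""
    return psi[(i, j)][xi, xj] if (i, j) in psi else psi[(j, i)][xj, xi]

for _ in range(100):                      # iterate and hope for convergence
    new = {}
    for (i, j) in m:
        k = ({0, 1, 2} - {i, j}).pop()    # i's other neighbor on the triangle
        # m_{i->j}(xj) = sum_xi psi(xi, xj) * m_{k->i}(xi)
        out = np.array([sum(potential(i, j, xi, xj) * m[(k, i)][xi]
                            for xi in (0, 1)) for xj in (0, 1)])
        new[(i, j)] = out / out.sum()     # normalize for numerical stability
    m = new

# Approximate marginal at node 0: product of incoming messages, normalized.
b0 = m[(1, 0)] * m[(2, 0)]
print(b0 / b0.sum())
```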

