CS774. Markov Random Field : Theory and Application Lecture 02


CS774. Markov Random Field: Theory and Application, Lecture 02 (Sep 08, 2009)
Kyomin Jung, KAIST

Markov Random Field (MRF): Definition 1
An MRF is a collection of random variables X = (X_v)_{v ∈ V} defined on a graph G = (V, E), together with a probability distribution on them; each variable X_v sits on a vertex v and takes values in a finite alphabet set. X is called an MRF iff the conditional probability distribution of X_v at each vertex v depends only on its neighbors N(v):
P[X_v = x_v | X_u = x_u for all u ≠ v] = P[X_v = x_v | X_u = x_u for all u ∈ N(v)].
(Figure: an example graph G with numbered vertices.)
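Definition 1 can be checked numerically on a toy instance. The sketch below builds a small pairwise distribution on the path graph 0-1-2 (the potential values are arbitrary, purely illustrative choices) and verifies by brute-force enumeration that the conditional distribution of X_0 does not depend on the non-neighbor X_2:

```python
import itertools

# A tiny pairwise MRF on the path 0-1-2 with binary variables.
# Potential values are hypothetical, chosen only for illustration.
psi01 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
psi12 = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 2.0, (1, 1): 1.0}

def weight(x):
    """Unnormalized probability of the assignment x = (x0, x1, x2)."""
    return psi01[(x[0], x[1])] * psi12[(x[1], x[2])]

states = list(itertools.product([0, 1], repeat=3))
Z = sum(weight(x) for x in states)   # normalizing constant

def cond_x0(a, b, c):
    """P[X_0 = a | X_1 = b, X_2 = c], by direct enumeration."""
    return weight((a, b, c)) / sum(weight((ap, b, c)) for ap in (0, 1))

# Markov property: given the neighbor X_1, the non-neighbor X_2 is
# irrelevant to X_0, so varying c must not change the conditional.
for b in (0, 1):
    assert abs(cond_x0(0, b, 0) - cond_x0(0, b, 1)) < 1e-12
```

On a path graph this holds for any choice of edge potentials, since the factor containing x_2 cancels in the conditional.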

Markov Random Field (MRF): Definition 2
Given a graph G = (V, E), a subset S of V is called a cut of G if removal of S from G induces two or more connected components, i.e., there exist nonempty A, B ⊆ V such that A ∪ B = V \ S and there is no edge connecting A and B.
Definition 2. X is called an MRF if for any cut S of G, X_A and X_B are conditionally independent given X_S.
(Figure: a graph G split by a cut S into components A and B.)

Hammersley-Clifford Theorem
A subset C of V is called a clique of G if the subgraph of G induced by C is a complete graph. Consider the following form of probability distribution for X:
P[X = x] = (1/Z) ∏_{cliques C of G} ψ_C(x_C), for some nonnegative functions ψ_C, where Z is the normalizing constant.
Thm 1. Such an X satisfies the definition of an MRF.
Thm 2 (Hammersley-Clifford). Every strictly positive MRF (i.e., P[X = x] > 0 for all x) can be expressed in the above form; that is, it decomposes as a product of functions associated with the cliques of the graph.
Note that computing Z is (poly-time) equivalent to computing P[X = x] for any one specific assignment x.
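The closing remark, that Z and any single probability value determine each other once the clique potentials are known, can be seen concretely. In this hypothetical three-variable example the cliques are the two edges of a path; given the unnormalized weight and one probability, Z comes back as their ratio:

```python
import itertools

# Clique-factored distribution; potential values are hypothetical.
def weight(x):
    a, b, c = x
    psi_ab = 2.0 if a == b else 1.0   # potential on clique {0, 1}
    psi_bc = 3.0 if b == c else 1.0   # potential on clique {1, 2}
    return psi_ab * psi_bc

states = list(itertools.product([0, 1], repeat=3))
Z = sum(weight(x) for x in states)

x0 = (0, 0, 0)
p_x0 = weight(x0) / Z          # pretend an oracle gave us P[X = x0]
Z_recovered = weight(x0) / p_x0  # ...then Z follows immediately
assert abs(Z_recovered - Z) < 1e-9
```

This is why computing Z and computing a single probability P[X = x] are computationally equivalent tasks.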

Pair-wise MRF
X is a pair-wise MRF if
P[X = x] = (1/Z) ∏_{v ∈ V} ψ_v(x_v) ∏_{(u,v) ∈ E} ψ_{uv}(x_u, x_v)
for some nonnegative functions ψ_v and ψ_{uv}. Z is called the partition function of the above expression.
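A minimal sketch of the pairwise form, assuming a triangle graph and made-up node and edge potentials; the partition function Z is computed by summing the unnormalized product over all assignments, which is feasible only for tiny models:

```python
import itertools

# Pairwise MRF on the triangle graph {0, 1, 2}; all potential values
# below are hypothetical, chosen only to illustrate the definition.
node_psi = {0: [1.0, 2.0], 1: [1.5, 1.0], 2: [1.0, 1.0]}
edges = [(0, 1), (1, 2), (0, 2)]

def edge_psi(a, b):
    return 2.0 if a == b else 1.0        # favors agreeing endpoints

def weight(x):
    w = 1.0
    for v in (0, 1, 2):
        w *= node_psi[v][x[v]]
    for (u, v) in edges:
        w *= edge_psi(x[u], x[v])
    return w

states = list(itertools.product([0, 1], repeat=3))
Z = sum(weight(x) for x in states)       # the partition function
probs = {x: weight(x) / Z for x in states}   # the normalized MRF
```

Note that every edge of the triangle is a clique, so this pairwise form is a special case of the clique factorization above.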

Problem of Interest 1: Computing the Maximum A Posteriori (MAP) assignment
The MAP assignment is the most likely assignment (the mode of the distribution): x* = argmax_x P[X = x].
It is NP-hard to compute in general, and corresponds to an "optimization" problem; heuristics or approximation algorithms for specific MRFs are commonly used. In the weather-condition example, the MAP is the most likely joint weather condition over all the states.
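For intuition only, the MAP assignment of a small pairwise MRF can be found by exhaustive search. This is exponential in |V|, so it illustrates the definition rather than a practical algorithm, and the potentials are hypothetical:

```python
import itertools

# Brute-force MAP on a 4-node path; potentials are made up.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3)]

def node_psi(v, a):
    # Even-indexed nodes prefer label 1; odd nodes are indifferent.
    return 2.0 if (v % 2 == 0 and a == 1) else 1.0

def edge_psi(a, b):
    return 3.0 if a == b else 1.0        # edges prefer agreement

def weight(x):
    w = 1.0
    for v in V:
        w *= node_psi(v, x[v])
    for (u, v) in E:
        w *= edge_psi(x[u], x[v])
    return w

# argmax over all 2^|V| assignments; maximizing the unnormalized
# weight is the same as maximizing P[X = x], since Z is constant.
x_map = max(itertools.product([0, 1], repeat=len(V)), key=weight)
```

Here the node preferences and the agreement-favoring edges pull in the same direction, so the all-ones assignment wins.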

Example 1: Image denoising
We want to restore a binary (-1/+1) image from a noisy observation Y of size n × n; consider Y as an element of {-1, +1}^{n×n}, with each entry colored black (-1) or white (+1). We will use an MRF model to restore the original image; the underlying graph is the n × n grid graph.

Example 1: Image denoising
We will utilize two properties of the original image:
It is similar to Y.
It is smooth, i.e., the number of edges whose endpoints have different colors is small.
Define the following MRF (the exponential form below is the standard Ising-style model encoding these two terms, with parameters h, β > 0):
P[X = x] ∝ exp( h ∑_{v ∈ V} x_v y_v + β ∑_{(u,v) ∈ E} x_u x_v ).
The MAP assignment is then taken as the estimate of the original image.
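As a sketch of how the model is used, the following applies iterated conditional modes (ICM), a greedy coordinate-wise local search, to a toy 8×8 instance of the MRF above. ICM is a heuristic, not exact MAP, and the parameter values h, β, and the toy image are illustrative assumptions:

```python
# ICM for the Ising-style denoising MRF; h, beta and the toy image
# are hypothetical choices for illustration.
h, beta = 1.0, 2.0
n = 8
# Ground truth: left half black (-1), right half white (+1).
truth = [[-1 if j < n // 2 else 1 for j in range(n)] for i in range(n)]
# Observation y: the truth with three isolated pixels flipped.
y = [row[:] for row in truth]
for (i, j) in [(1, 1), (3, 6), (6, 2)]:
    y[i][j] = -y[i][j]

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < n and 0 <= j + dj < n:
            yield i + di, j + dj

x = [row[:] for row in y]            # start from the noisy image
for _ in range(5):                   # coordinate-wise sweeps
    for i in range(n):
        for j in range(n):
            # Local contribution of setting x_{ij} = s, larger is
            # better: h*s*y pulls toward the observation, beta*s*x
            # pulls toward agreement with grid neighbors.
            def local(s):
                return h * s * y[i][j] + beta * sum(
                    s * x[a][b] for a, b in neighbors(i, j))
            x[i][j] = max((-1, 1), key=local)
```

With isolated noise pixels and these parameter values, each flipped pixel is outvoted by its four neighbors and the truth is recovered; exact MAP for such grid models can be computed with graph cuts.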

Example 2: Maximum Weight Independent Set (MWIS)
Given a graph G = (V, E), a subset I of V is called an independent set if for all e ∈ E, the two endpoints of e do not both belong to I. When the vertices are weighted, an independent set I is called an MWIS if the sum of the weights of the vertices in I is maximum.
Finding an MWIS is equivalent to finding a MAP assignment in the following MRF on {0, 1}^V:
P[X = x] ∝ ∏_{v ∈ V} e^{w_v x_v} ∏_{(u,v) ∈ E} 1[x_u x_v = 0],
where w_v is the weight at node v and the edge factor forbids both endpoints of an edge taking value 1.
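The definition can be exercised on a toy instance: a hypothetical 4-cycle with made-up weights, solved by exhaustive search over the 2^|V| indicator vectors. MWIS is NP-hard in general, so this does not scale; it only mirrors the MAP formulation above:

```python
import itertools

# Brute-force MWIS on a 4-cycle; graph and weights are hypothetical.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
w = {0: 1.0, 1: 4.0, 2: 1.0, 3: 2.0}

def is_independent(x):
    # The indicator-vector form of the edge constraint 1[x_u*x_v = 0].
    return all(x[u] * x[v] == 0 for (u, v) in E)

def total_weight(x):
    return sum(w[v] * x[v] for v in V)

best = max((x for x in itertools.product([0, 1], repeat=len(V))
            if is_independent(x)), key=total_weight)
```

Because the MRF's edge factor zeroes out every non-independent assignment, maximizing ∏_v e^{w_v x_v} over the survivors is exactly maximizing the total weight, so the MAP assignment is the MWIS indicator vector.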

Example 2: Maximum Weight Independent Set (MWIS)
MWIS has applications to wireless networks with queues: interference in the network requires that the set of transmitters active at each time must be an independent set. The Maximum Weight Scheduling algorithm [Tassiulas, Ephremides '92] is abstracted as finding an (approximate) MWIS with weights being the queue sizes of the vertices.

Problem of Interest 2: Computing marginal probabilities
The marginal P[X_v = x_v] = ∑_{x' : x'_v = x_v} P[X = x'] can be computed by computing partition functions of sub-MRFs (the MRF with X_v clamped to the value x_v). Note that computing Z of an MRF X is (poly-time) computationally equivalent to computing P[X = x] for one x. Both tasks are NP-hard in general.
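A sketch of the reduction from marginals to partition functions, on a hypothetical path 0-1-2: clamping X_0 to a value a and summing the remaining weights gives the partition function of a sub-MRF, and P[X_0 = a] is the ratio of the clamped to the full partition function:

```python
import itertools

# Marginals via clamped partition functions; potentials hypothetical.
def edge_psi(a, b):
    return 2.0 if a == b else 1.0

def weight(x):
    # Pairwise MRF on the path 0-1-2.
    return edge_psi(x[0], x[1]) * edge_psi(x[1], x[2])

states = list(itertools.product([0, 1], repeat=3))
Z = sum(weight(x) for x in states)          # full partition function

def Z_clamped(a):
    """Partition function of the sub-MRF with X_0 fixed to a."""
    return sum(weight(x) for x in states if x[0] == a)

marg = [Z_clamped(a) / Z for a in (0, 1)]   # marginal of X_0
```

Here the model is symmetric under relabeling 0 ↔ 1, so both marginal values come out equal; in general each marginal costs one clamped-Z computation, which is why marginals inherit the NP-hardness of computing Z.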