
1 Optimization of Pearl’s Method of Conditioning and Greedy-Like Approximation Algorithm for the Vertex Feedback Set Problem Authors: Ann Becker and Dan Geiger Presented by: Igor Kviatkovsky

2 Outline
– Introduction
– The Loop Cutset (LC) problem
– The Weighted Vertex Feedback Set (WVFS) problem
– Reduction from LC to WVFS
– MGA (Modified Greedy Algorithm): a 2-approximation for WVFS

3 Introduction
Pearl's method of conditioning is one of the known inference methods for Bayesian networks.
Find a set of vertices such that, once the corresponding variables are instantiated, the remaining network is singly connected:
– Pearl's UPDATE-TREE procedure can then be applied.
[Figure: a network with vertices A, B, C, D, E, I, J, K, L, M; fixing the values of A and B (to a and b) removes them from the network.]

4 Introduction
Fixing the values of A, B and L breaks all loops. But can we choose fewer variables to break all loops? Are some variables better choices than others?
Motivation (Becker & Geiger):
– Choose the vertices that break more loops.
– Vertices with higher degree are more likely to break more loops (vertex J, with degree 3, breaks 3 loops!).
[Figure: the same network with vertices A, B, C, D, E, I, J, K, L, M.]

5 The Loop Cutset Problem
Definitions:
– The underlying graph G of a directed graph D is the undirected graph formed by ignoring the directions of the edges in D.
– A loop in D is a subgraph of D whose underlying graph is a cycle.
– A vertex v is a sink with respect to a loop Γ if the two edges adjacent to v in Γ are directed into v.
Every loop must contain at least one vertex that is not a sink with respect to that loop.
[Figure: a loop with one vertex marked as a sink.]

6 The Loop Cutset Problem
– Each vertex that is not a sink with respect to a loop Γ is called an allowed vertex with respect to Γ.
– A loop cutset of a directed graph D is a set of vertices that contains at least one allowed vertex with respect to each loop in D.
– A minimum loop cutset of a weighted directed graph D is a loop cutset of minimum weight.
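As an illustration of these definitions, here is a minimal sketch of a checker. The representation (a loop as a list of directed (tail, head) edges) and all names are assumptions for the example, not notation from the slides; note that it requires the loops of D to be supplied explicitly, so it is a definition checker rather than a practical algorithm.

```python
def is_sink_wrt_loop(v, loop_edges):
    """True if the two edges touching v in the loop are both directed into v.
    loop_edges: the directed edges of one loop of D, as (tail, head) pairs."""
    incident = [e for e in loop_edges if v in e]
    return len(incident) == 2 and all(head == v for (_tail, head) in incident)


def is_loop_cutset(cutset, loops):
    """True if every loop contains at least one allowed (non-sink) vertex of the cutset.
    loops: an iterable of loops, each given as a list of directed edges."""
    for loop in loops:
        on_loop = {x for e in loop for x in e}
        if not any(u in on_loop and not is_sink_wrt_loop(u, loop) for u in cutset):
            return False
    return True


# Hypothetical directed triangle A->B, C->B, C->A: B is a sink, A and C are allowed.
triangle = [("A", "B"), ("C", "B"), ("C", "A")]
assert is_sink_wrt_loop("B", triangle)
assert not is_loop_cutset({"B"}, [triangle])
assert is_loop_cutset({"C"}, [triangle])
```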

7 The Loop Cutset Problem
Example:
– L is a sink with respect to the loop ACILJDA.
– L is an allowed vertex with respect to the loop JLMJ.
– {A, B, L} and {C, J} are both valid loop cutsets of the graph.
– Suppose all variables' domains have the same size r:
  The number of instances associated with {C, J} is r².
  The number of instances associated with {A, B, L} is r³.
  {C, J} is of lower weight than {A, B, L}.
[Figure: the same network with vertices A, B, C, D, E, I, J, K, L, M.]

8 Weighted Vertex Feedback Set (WVFS)
Let G(V, E) be an undirected graph and let w: V → R⁺ be a weight function on the vertices of G.
– A vertex feedback set of G is a subset of vertices F ⊆ V such that each cycle in G passes through at least one vertex in F.
– The weight of a set of vertices X is w(X) = Σ_{v∈X} w(v).
– A minimum vertex feedback set of a weighted graph G is a vertex feedback set F* with the minimum weight.
– The WVFS problem is to find a minimum vertex feedback set of a given weighted graph G with weight function w.
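Since F is a vertex feedback set exactly when G with F removed contains no cycle (i.e. is a forest), the definition can be checked by peeling away degree-0/1 vertices. A minimal sketch, assuming a simple graph given as a vertex list, an edge list and a weight dict (all names are illustrative):

```python
def is_vertex_feedback_set(vertices, edges, F):
    """True iff every cycle of the undirected graph (vertices, edges) passes through
    at least one vertex of F, i.e. removing F leaves a forest (simple graph assumed)."""
    keep = set(vertices) - set(F)
    adj = {v: set() for v in keep}
    for u, v in edges:
        if u in keep and v in keep and u != v:
            adj[u].add(v)
            adj[v].add(u)
    # A graph is a forest iff repeatedly deleting degree-0/1 vertices empties it.
    changed = True
    while changed:
        changed = False
        for x in list(adj):
            if len(adj[x]) <= 1:
                for u in adj[x]:
                    adj[u].discard(x)
                del adj[x]
                changed = True
    return not adj


def weight(X, w):
    """w(X): total weight of the vertex set X, with w given as a dict."""
    return sum(w[v] for v in X)
```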

9 Reduction from LC to WVFS
Given a weighted directed graph (D, w), a splitting weighted undirected graph D_s with a weight function w_s is constructed:
– Split each vertex v in D into two vertices v_in and v_out in D_s, and connect v_in and v_out by an edge.
– All edges incoming to v become undirected edges incident to v_in.
– All edges outgoing from v become undirected edges incident to v_out.
– Set w_s(v_in) = ∞ and w_s(v_out) = w(v).
[Figure: a vertex v with weight w(v) is split into v_in (weight ∞) and v_out (weight w(v)); loops Γ1, Γ2 in D correspond to cycles C1, C2 in D_s.]
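A sketch of the splitting construction in code; the pair-tuple representation of v_in / v_out and the function name are assumptions for illustration:

```python
import math

def split_graph(vertices, arcs, w):
    """Build the splitting graph D_s of a directed graph D = (vertices, arcs) with
    vertex weights w.  Each vertex v becomes (v, 'in') and (v, 'out');
    returns (Vs, Es, ws): undirected vertices, edges, and weights."""
    Vs, Es, ws = [], [], {}
    for v in vertices:
        v_in, v_out = (v, "in"), (v, "out")
        Vs += [v_in, v_out]
        Es.append((v_in, v_out))        # connect the two halves of v
        ws[v_in] = math.inf             # infinite weight: v_in is never worth picking
        ws[v_out] = w[v]
    for u, v in arcs:                   # arc u -> v in D
        Es.append(((u, "out"), (v, "in")))  # incoming arcs attach to v_in, outgoing to u_out
    return Vs, Es, ws
```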

10 Algorithm LC
Ψ(X) is the set obtained by replacing each vertex v_in or v_out in X by its source vertex v in D.
Algorithm LC
– Input: a Bayesian network D.
– Output: a loop cutset of D.
1. Construct the splitting graph D_s with weight function w_s.
2. Apply MGA to (D_s, w_s) to obtain a vertex feedback set F.
3. Output Ψ(F).
There is a one-to-one and onto correspondence between loops in D and cycles in D_s, so MGA's 2-approximation for WVFS yields a 2-approximation for LC!
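Putting the pieces together, a sketch of Algorithm LC; it assumes split_graph from the sketch above and an mga routine implementing phases 1 and 2 (a sketch of mga appears after slides 12-13 below):

```python
def psi(X):
    """Psi(X): map each split vertex (v, 'in') or (v, 'out') back to its source vertex v."""
    return {v for (v, _tag) in X}


def loop_cutset(vertices, arcs, w):
    """Algorithm LC (sketch): split D, run MGA on the splitting graph, map back with Psi."""
    Vs, Es, ws = split_graph(vertices, arcs, w)   # step 1
    F = mga(Vs, Es, ws)                           # step 2 (phases 1 and 2)
    return psi(F)                                 # step 3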

11 Algorithm GA (Greedy Algorithm)
Input: a weighted undirected graph G(V, E, w).
Output: a vertex feedback set F.
– F ← Ø; i ← 1
– Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E; call the resulting graph G_1.
– While G_i is not the empty graph, do:
  1. Pick a vertex v_i for which w(v_i)/d(v_i) is minimum in G_i.
  2. F ← F ∪ {v_i}
  3. V ← V \ {v_i}
  4. i ← i + 1
  5. Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E; call the resulting graph G_i.
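A runnable sketch of GA, assuming a simple undirected graph given as a vertex list, an edge list and a weight dict; names are illustrative:

```python
def ga(vertices, edges, w):
    """Greedy Algorithm GA (sketch): after peeling degree-0/1 vertices, repeatedly pick
    the vertex minimising w(v)/d(v) in the current graph."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def peel():
        # Repeatedly remove vertices of degree 0 or 1 together with their edges.
        changed = True
        while changed:
            changed = False
            for x in list(adj):
                if len(adj[x]) <= 1:
                    for u in adj[x]:
                        adj[u].discard(x)
                    del adj[x]
                    changed = True

    F = []
    peel()                                               # build G_1
    while adj:                                           # while G_i is not empty
        v = min(adj, key=lambda x: w[x] / len(adj[x]))   # minimum cost w(v)/d(v)
        F.append(v)
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]                                       # remove v_i from the graph
        peel()                                           # build G_{i+1}
    return F
```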

12 Algorithm MGA (Modified GA)
– F' ← Ø; i ← 1
– Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E; call the resulting graph G_1.
– While G_i is not the empty graph, do:
  1. Pick a vertex v_i for which w(v_i)/d(v_i) is minimum in G_i.
  2. F' ← F' ∪ {v_i}
  3. V ← V \ {v_i}
  4. i ← i + 1
  5. Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E; call the resulting graph G_i. For every edge e = (u_1, u_2) removed in this process, do:
     – C(e) ← w(v_i)/d(v_i)
     – w(u_1) ← w(u_1) − C(e)
     – w(u_2) ← w(u_2) − C(e)
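A sketch of phase 1 of MGA, mirroring the GA sketch above but charging the effective cost C(e) = w(v_i)/d(v_i) to the edges removed in each iteration and subtracting the charge from the weights of their endpoints. The graph representation and names are assumptions; w is copied so the caller's weights stay untouched.

```python
def mga_phase1(vertices, edges, w):
    """Phase 1 of MGA (sketch): greedy selection with edge-cost bookkeeping."""
    w = dict(w)
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def peel(cost=None):
        # Drop degree-0/1 vertices repeatedly; if a cost is given, charge it to each
        # removed edge and subtract it from both endpoints' weights.
        changed = True
        while changed:
            changed = False
            for x in list(adj):
                if len(adj[x]) <= 1:
                    for u in adj[x]:
                        if cost is not None:
                            w[x] -= cost
                            w[u] -= cost
                        adj[u].discard(x)
                    del adj[x]
                    changed = True

    F_prime = []
    peel()                                               # build G_1 (no charges yet)
    while adj:
        v = min(adj, key=lambda x: w[x] / len(adj[x]))
        c = w[v] / len(adj[v])                           # effective cost per removed edge
        F_prime.append(v)
        for u in adj[v]:                                 # edges incident to v_i
            w[u] -= c
            adj[u].discard(v)
        del adj[v]
        peel(cost=c)                                     # edges removed in the cleanup
    return F_prime
```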

13 MGA (Phase 2)
Phase 2:
– F ← F'
– For i = |F'| down to 1 do:
  If every cycle in G that intersects {v_i} also intersects F \ {v_i}, then F ← F \ {v_i}.
After applying phase 2 to the vertex feedback set, all redundant vertices are removed and we get a minimal vertex feedback set.
Before applying phase 2: 4-approximation.
After applying phase 2: 2-approximation.
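The pruning test "every cycle that intersects {v_i} also intersects F \ {v_i}" is equivalent to "F \ {v_i} is still a vertex feedback set", so phase 2 can reuse the checker from slide 8. A sketch, assuming is_vertex_feedback_set and mga_phase1 from the earlier sketches are in scope:

```python
def prune(vertices, edges, F_prime):
    """Phase 2 of MGA (sketch): scan F' in reverse insertion order and drop every vertex
    whose removal still leaves a vertex feedback set."""
    F = list(F_prime)
    for v in reversed(F_prime):                 # i = |F'| down to 1
        candidate = [u for u in F if u != v]
        if is_vertex_feedback_set(vertices, edges, candidate):
            F = candidate                       # v is redundant and can be dropped
    return F


def mga(vertices, edges, w):
    """Full MGA (sketch): greedy phase 1 followed by the pruning phase 2."""
    return prune(vertices, edges, mga_phase1(vertices, edges, w))
```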

14 MGA - Motivation
In each iteration remove all the isolated nodes and all the chains:
– Isolated nodes and chains lie on no cycle, so they never need to be in the feedback set.
Pick a node with the minimum cost w(v_i)/d(v_i) for the vertex feedback set:
– A small w(v_i) means it is worthwhile to take node v_i into the vertex feedback set, since it has a low weight.
– A large d(v_i) means it is worthwhile to take node v_i into the vertex feedback set, since it might break a large number of loops (it has a high degree), even if its weight w(v_i) is large.

15 MGA - Motivation
Subtract the effective cost (the cost paid per edge) from the weights of the neighboring nodes.
[Example figure: three stages G_1, G_2, G_3 of a small graph on v_1, …, v_5. Initially F' = Φ. In G_1 the vertex with weight 1 and effective cost c = 0.5 is picked (F' = {v_1}); each of its edges is charged C(e) = 0.5 and its neighbors' weights drop accordingly (e.g. 10 → 9.5, 3 → 2.5). In G_2 the minimum effective cost is 2.5/2 = 1.25, that vertex is picked (F' = {v_1, v_2}) and C(e) = 1.25 is charged. G_3 shows the remaining graph, and finally F = F'.]

16 Performance and Summary
– The theoretical (worst-case) approximation ratio of MGA is 2.
– The average approximation ratio of MGA on randomly generated graphs is 1.22.
– The complexity of the conditioning method's running time can be computed with high precision using MGA, before the conditioning computations themselves are run.

17 Extra slides – MGA Analysis and approximation ratio proof

18 MGA - Analysis
– F* is a minimum vertex feedback set of G(V, E, w).
– The vertices in F' are {v_1, v_2, …, v_t}, where the v_i are indexed in the order in which they were inserted into F'.
– w_i(v) and d_i(v) are the weight and the degree, respectively, of vertex v in G_i; V_i is the set of vertices of G_i.
– Let …
– Theorem 6 (proved later): …

19 MGA - Analysis
Let Γ_1(v) be the set of edges in G_1 for which at least one endpoint is v.
From the description of the algorithm, for every vertex v in G_1: … ; and if v ∈ F: …
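A hedged reconstruction of the relations presumably intended here, following the standard edge-cost accounting (with C(e) taken as 0 for edges that are never assigned a cost); this is a sketch, not the slides' verbatim formulas:

```latex
\text{for every vertex } v \text{ of } G_1:\quad
    w(v) \;\ge\; \sum_{e \in \Gamma_1(v)} C(e),
\qquad\text{and if } v \in F:\quad
    w(v) \;=\; \sum_{e \in \Gamma_1(v)} C(e).
```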

20 Theorem 3
Theorem 3: Algorithm MGA always outputs a vertex feedback set F whose weight is no more than twice the weight of a minimum vertex feedback set F*.
Proof:
– Let i = j + 1; then: …
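A hedged sketch of how the pieces combine, using the relations reconstructed above together with Theorem 6 applied to each graph G_i (a reconstruction of the standard argument, not the slides' exact derivation):

```latex
w(F) \;=\; \sum_{v \in F} \sum_{e \in \Gamma_1(v)} C(e)
     \;\le\; 2 \sum_{v \in F^*} \sum_{e \in \Gamma_1(v)} C(e)
     \;\le\; 2\, w(F^*).
```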

21 Theorem 3 (Proof)
– By grouping edges according to the iterations in which they are assigned a weight: …
– Let …
– By definition: …

22 Theorem 3 (Proof)
Theorem 6: grouping edges by the iterations in which they are assigned a weight: …

23 Theorem 6
Definitions:
– Let …
– Let d_X(v) be the number of edges whose one endpoint is v and whose other endpoint is a vertex in X.
Theorem 6: Let G be a weighted graph in which every vertex has degree strictly greater than 1, let F be a minimal vertex feedback set of G, and let F* be an arbitrary vertex feedback set of G (possibly a minimum-weight one). Then: …
[Figure: a vertex v with three edges into a set X, illustrating d_X(v) = 3.]
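To make the notation concrete, a one-line sketch of d_X(v) over an adjacency-map representation (names are illustrative):

```python
def d_X(adj, v, X):
    """d_X(v): the number of edges with one endpoint v and the other endpoint in X.
    adj maps each vertex to the set of its neighbours (simple graph assumed)."""
    return sum(1 for u in adj[v] if u in X)
```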

24 Theorem 6 (Proof)
To prove the theorem, the l.h.s. is divided into two terms and an upper bound is provided for each term.
Lemma 7: Let G, F and F* be defined as above. Then: …
Proof:
– For every set of vertices B in G: …
– Since d(v) ≥ 2 for each v in G: … ≥ 0.
[Figure: the sets F, B and F*.]

25 Theorem 6 (Lemma 7)
– Since d_B(v) ≤ d(v): …
– We have to prove that the following holds for some set of vertices B: …
[Figure: the sets F, B and F*.]

26 Lemma 7
– Let us define a set B for which this inequality can be proven.
– F is minimal, so each vertex in F can be associated with a cycle in G that contains no other vertex of F. We define a graph H that consists of the union of these cycles (one cycle per vertex of F).
– Definition: a linkpoint is a vertex of degree 2; a branchpoint is a vertex of degree larger than 2. Every vertex in F is a linkpoint in H.
– Let B be the set of vertices of H.
[Figure: a linkpoint and a branchpoint; the sets F, B and F*.]

27 Lemma 7
The proof is constructive. We apply the following procedure to H and show that for each v ∈ F there are terms on the r.h.s. that contribute 2 and were not used for any other vertex of F.
– Pick a vertex v ∈ F and follow the two paths p_1 and p_2 in H from it, until the first branchpoint on each path is found.
– There are 3 cases to consider.
Case 1:
– From the definition of H: …
– d_B(b_1) − 2 ≥ 1 and d_B(b_2) − 2 ≥ 1.
– There are terms that contribute 2 to the r.h.s.
[Figure: vertex v with paths p_1 and p_2 leading to branchpoints b_1 and b_2.]

28 Lemma 7
Case 2:
– If d_B(b_1) ≥ 4, then d_B(b_1) − 2 ≥ 2.
– If d_B(b_1) = 3, then d_B(b_1) − 2 = 1 and d_B(b_2) − 2 ≥ 1.
– There are terms that contribute 2 to the r.h.s.
Case 3: an isolated cycle.
– There exists a vertex in F* that resides on no other cycle of H.
– There are at most … such cases.
– There are terms that contribute 2 to the r.h.s.
[Figures: case 2 – paths p_1 and p_2 from v reach branchpoint b_1, with a path p_3 continuing to b_2; case 3 – an isolated cycle through v.]

29 Lemma 7
– Remove the paths p_1 and p_2 from H, obtaining a graph in which each vertex in F still resides on a cycle that contains no other vertices of F.
– Continue until F is exhausted.

30 Lemma 8: Let G, F and F* be defined as above. Then: …
Proof: Note that … (by Lemma 7).
[Figure: an example graph on v_1, …, v_5 with the sets F and F*, illustrating terms such as (2−2) + (3−2) = 1, and values 1, −1, 1.]

31 Lemma 8
– So if we prove …, then we prove the lemma.
– The graph induced by … is a forest, and since the number of edges in a forest is smaller than the number of vertices, the following holds: …

32 Theorem 6 (Proof)
To complete the proof, we use the bounds obtained from Lemmas 7 and 8: …

