
Authors : Ann Becker and Dan Geiger Presented by : Igor Kviatkovsky






Presentation Transcript


  1. Optimization of Pearl’s Method of Conditioning and Greedy-Like Approximation Algorithm for the Vertex Feedback Set Problem Authors: Ann Becker and Dan Geiger Presented by: Igor Kviatkovsky

  2. Outline • Introduction • The Loop Cutset (LC) problem • Weighted Vertex Feedback Set (WVFS) problem • Reduction from LC to WVFS • MGA (Modified Greedy Algorithm) • 2-approximation for WVFS

  3. Introduction • Pearl’s method of conditioning is one of the well-known inference methods for Bayesian networks • Find a set of vertices such that once the corresponding variables are instantiated, the remaining network is singly connected • Pearl’s UPDATE-TREE procedure can then be applied • [Figure: an example network before and after fixing the values of A and B]

  4. Introduction • Fixing the values of A, B and L breaks all loops. But can we choose fewer variables to break all loops? • Are some variables better to choose than others? • Motivation (Geiger & Becker): • choose the vertices that break more loops • vertices with higher degree are more likely to break more loops (vertex J, with degree 3, breaks 3 loops!) • [Figure: the example network with vertices A, B, C, D, E, I, J, K, L, M]

  5. The Loop Cutset Problem • Definitions: • The underlying graph G of a directed graph D is the undirected graph formed by ignoring the directions of the edges in D. • A loop in D is a subgraph of D whose underlying graph is a cycle. • A vertex v is a sink with respect to a loop Γ if the two edges adjacent to v in Γ are directed into v. • Every loop must contain at least one vertex that isn’t a sink with respect to that loop. • [Figure: a loop with one vertex marked as a sink]

  6. The Loop Cutset Problem • Each vertex that isn’t a sink with respect to a loop Γ is called an allowed vertex with respect to Γ. • A loop cutset of a directed graph D is a set of vertices that contains at least one allowed vertex with respect to each loop in D. • A minimum loop cutset of a weighted directed graph D is one for which the weight is minimum.

  7. The Loop Cutset Problem • Example: • L is a sink with respect to the loop ACILJDA • L is an allowed vertex with respect to the loop JLMJ • {A,B,L} and {C,J} are valid loop cutsets of the graph • Suppose all variables’ domains have the same size r • The number of instances associated with {C,J} is r² • The number of instances associated with {A,B,L} is r³ • {C,J} is of lower weight than {A,B,L} • [Figure: the example network with vertices A, B, C, D, E, I, J, K, L, M]

  8. Weighted Vertex Feedback Set (WVFS) • Let G(V,E) be an undirected graph. • Let w: V → R⁺ be a weight function on the vertices of G. • A vertex feedback set of G is a subset of vertices F ⊆ V such that each cycle in G passes through at least one vertex in F. • The weight of a set of vertices X is w(X) = Σ_{v∈X} w(v). • A minimum vertex feedback set of a weighted graph G is a vertex feedback set F* of minimum weight. • The WVFS problem is to find a minimum vertex feedback set of a given weighted graph G with weight function w.
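The defining property above (G − F is cycle-free) is easy to test with a union-find pass. The sketch below is my own illustration, not code from the paper; a graph is assumed to be a dict mapping each vertex to a set of neighbors in a simple undirected graph.

```python
def has_cycle(graph, removed):
    """Return True if graph minus `removed` contains a cycle (union-find)."""
    parent = {v: v for v in graph if v not in removed}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    seen = set()
    for u in parent:
        for v in graph[u]:
            if v in removed or (v, u) in seen:
                continue                   # skip removed vertices, count each edge once
            seen.add((u, v))
            ru, rv = find(u), find(v)
            if ru == rv:                   # edge closes a cycle
                return True
            parent[ru] = rv
    return False

def is_vertex_feedback_set(graph, F):
    """F is a vertex feedback set iff G - F is acyclic (a forest)."""
    return not has_cycle(graph, set(F))
```

On a triangle, the empty set is not a vertex feedback set, but any single vertex is.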

  9. Reduction from LC to WVFS • Given a weighted directed graph (D,w), a splitting weighted undirected graph Ds with weight function ws is constructed: • Split each vertex v in D into two vertices v_in and v_out in Ds, and connect v_in and v_out. • All edges incoming to v become undirected edges incident to v_in. • All edges outgoing from v become undirected edges incident to v_out. • Set ws(v_in) = ∞ and ws(v_out) = w(v). • [Figure: a vertex v lying on loops Γ1 and Γ2 in D, split into v_in (weight ∞) and v_out (weight w(v)) lying on the corresponding cycles C1 and C2 in Ds]
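The splitting construction can be sketched directly. Representation and names here are my own assumptions (a directed graph as a dict v → set of successors, split vertices encoded as `(v, "in")` / `(v, "out")` pairs), not the paper's code.

```python
INF = float("inf")

def split_graph(D, w):
    """Build the undirected splitting graph Ds and its weight function ws."""
    Ds = {}
    ws = {}
    for v in D:
        vin, vout = (v, "in"), (v, "out")
        Ds[vin] = {vout}                    # edge between the two halves of v
        Ds[vout] = {vin}
        ws[vin] = INF                       # v_in can never be the cheapest pick
        ws[vout] = w[v]
    for u in D:
        for v in D[u]:                      # directed edge u -> v ...
            Ds[(u, "out")].add((v, "in"))   # ... becomes undirected {u_out, v_in}
            Ds[(v, "in")].add((u, "out"))
    return Ds, ws
```

Because ws(v_in) = ∞, any finite-weight feedback set of Ds uses only v_out vertices, which map back to vertices of D.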

  10. Algorithm LC • Ψ(X) is the set obtained by replacing each vertex v_in or v_out in X by the corresponding source vertex v in D • Algorithm LC: • Input: A Bayesian network D. • Output: A loop cutset of D. • Construct the splitting graph Ds with weight function ws. • Apply MGA on (Ds, ws) to obtain a vertex feedback set F. • Output Ψ(F). • There is a one-to-one and onto correspondence between loops in D and cycles in Ds • Hence a 2-approximation of MGA for WVFS yields a 2-approximation for LC!
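Under the `(v, "in")` / `(v, "out")` encoding assumed in my splitting sketch, the mapping Ψ is a one-liner:

```python
def psi(X):
    """Replace each v_in / v_out in X by its source vertex v, merging duplicates."""
    return {v for (v, _side) in X}
```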

  11. Algorithm GA (Greedy Algorithm) • Input: A weighted undirected graph G(V,E,w) • Output: A vertex feedback set F. • F ← Ø; i ← 1; • Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E, and call the resulting graph G1. • While Gi is not the empty graph, do: • Pick a vertex vi for which w(vi)/d(vi) is minimum in Gi • F ← F ∪ {vi} • V ← V \ {vi} • i ← i + 1 • Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E, and call the resulting graph Gi.
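The steps above can be sketched as runnable code. The representation (dict of neighbor sets, simple graph) and names like `greedy_fvs` are my own, not the paper's.

```python
def _cleanup(g):
    """Repeatedly delete vertices of degree 0 or 1 together with their edges."""
    changed = True
    while changed:
        changed = False
        for v in list(g):
            if len(g[v]) <= 1:
                for u in g[v]:
                    g[u].discard(v)
                del g[v]
                changed = True

def greedy_fvs(graph, w):
    """GA: repeatedly pick the vertex minimizing w(v)/d(v) in the cleaned graph."""
    g = {v: set(nbrs) for v, nbrs in graph.items()}  # work on a copy
    F = set()
    _cleanup(g)                                      # build G1
    while g:
        v = min(g, key=lambda u: w[u] / len(g[u]))   # cheapest per degree
        F.add(v)
        for u in g[v]:
            g[u].discard(v)
        del g[v]
        _cleanup(g)
    return F
```

On two triangles sharing a vertex with unit weights, the shared vertex has the smallest w/d ratio and is picked first, after which the cleanup empties the graph.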

  12. Algorithm MGA (Modified GA) • F' ← Ø; i ← 1; • Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E, and call the resulting graph G1. • While Gi is not the empty graph, do: • Pick a vertex vi for which w(vi)/d(vi) is minimum in Gi • F' ← F' ∪ {vi} • V ← V \ {vi} • i ← i + 1 • Repeatedly remove all vertices with degree 0 or 1 from V and their adjacent edges from E, and call the resulting graph Gi. For every edge e=(u1,u2) removed in this process, do: • C(e) ← w(vi)/d(vi) • w(u1) ← w(u1) − C(e) • w(u2) ← w(u2) − C(e)
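Phase 1 of MGA can be sketched as follows. The representation is my own (dict of neighbor sets), and I read "every edge removed in this process" as covering all edges removed after v_i is picked, including those incident to v_i itself; that reading is an assumption of this sketch.

```python
def mga_phase1(graph, w):
    """MGA phase 1: greedy picks with edge-cost subtraction; returns F' in order."""
    g = {v: set(nbrs) for v, nbrs in graph.items()}
    w = dict(w)                                # weights are modified as we go
    order = []                                 # F' in insertion order

    def cleanup(cost=None):
        changed = True
        while changed:
            changed = False
            for v in list(g):
                if len(g[v]) <= 1:
                    for u in list(g[v]):       # edge (v, u) is removed
                        g[u].discard(v)
                        if cost is not None:   # charge C(e) to both endpoints
                            w[v] -= cost
                            w[u] -= cost
                    del g[v]
                    changed = True

    cleanup()                                  # build G1 (no charging yet)
    while g:
        v = min(g, key=lambda u: w[u] / len(g[u]))
        cost = w[v] / len(g[v])                # C(e) = w(v_i)/d(v_i)
        order.append(v)
        for u in list(g[v]):                   # remove and charge v's own edges
            g[u].discard(v)
            w[v] -= cost
            w[u] -= cost
        del g[v]
        cleanup(cost)                          # charge edges removed in the cleanup
    return order
```

Phase 2 then prunes this list to a minimal feedback set.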

  13. MGA (Phase 2) • Phase 2: • F ← F' • For i = |F'| down to 1 do: • If every cycle in G that intersects {vi} also intersects F \ {vi}, then: • F ← F \ {vi} • After applying phase 2 to the vertex feedback set, all redundant vertices are removed and we obtain a minimal vertex feedback set • Before applying phase 2: 4-approximation • After applying phase 2: 2-approximation
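The redundancy test above is equivalent to checking that F \ {v_i} is still a vertex feedback set, i.e. that G − (F \ {v_i}) is acyclic. A sketch under my own representation (dict of neighbor sets, simple graph with comparable vertex labels):

```python
def acyclic(graph, removed):
    """True iff graph minus `removed` is a forest (union-find check)."""
    parent = {v: v for v in graph if v not in removed}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for u in parent:
        for v in graph[u]:
            if v in removed or v <= u:        # count each undirected edge once
                continue
            ru, rv = find(u), find(v)
            if ru == rv:
                return False
            parent[ru] = rv
    return True

def mga_phase2(graph, F_prime):
    """Drop redundant vertices in reverse insertion order; returns a minimal F."""
    F = set(F_prime)
    for v in reversed(F_prime):               # i = |F'| down to 1
        if acyclic(graph, F - {v}):           # every cycle through v is covered
            F.discard(v)
    return F
```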

  14. MGA - Motivation • In each iteration, remove all the isolated nodes and all the chains • Isolated nodes and chains lie on no loop, so they cannot help restore single-connectedness • Pick the node with the minimum cost w(vi)/d(vi) into the vertex feedback set • A small w(vi) means it is worth taking node vi into the vertex feedback set, since it has a low weight • A large d(vi) means it is worth taking node vi into the vertex feedback set, since it might break a large number of loops (it has a high degree), even if its weight w(vi) is large

  15. MGA - Motivation • Subtract the effective cost (the cost paid per edge) from the weights of the neighboring nodes. • [Figure: a worked example on a five-vertex graph over three iterations G1, G2, G3, showing the vertex weights w, the effective costs c = w(v)/d(v), and the edge costs C(e); F' grows from Ø to {v1} to {v1, v2}, and finally F = F']

  16. Performance and Summary • The theoretical approximation ratio of MGA is 2 (in the worst case) • The average approximation ratio of MGA on randomly generated graphs is 1.22 • Before running the conditioning method itself, its running time can be estimated with high precision, since MGA bounds the weight of the loop cutset to within a factor of two

  17. Extra slides – MGA Analysis and approximation ratio proof

  18. MGA - Analysis • F* - a minimum vertex feedback set of G(V,E,w). • The vertices in F' are {v1,v2,…,vt}, where the vi are indexed in the order in which they were inserted into F'. • wi(v) and di(v) are the weight and degree, respectively, of vertex v in Gi. • Vi is the set of vertices of Gi. • Let [formula not transcribed] • Theorem 6 (proved later): [formula not transcribed]
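The two formulas on this slide were images and did not survive in the transcript. As a hedged reconstruction from my reading of the Becker-Geiger analysis (my notation, not the slide itself), the quantity being defined is the effective cost of the i-th pick, and Theorem 6 is the degree inequality for minimal feedback sets:

```latex
% Hedged reconstruction, not transcribed from the slide:
% effective cost of the i-th greedy pick,
\gamma_i \;=\; \frac{w_i(v_i)}{d_i(v_i)} \;=\; \min_{v \in V_i} \frac{w_i(v)}{d_i(v)},
% and Theorem 6: for a minimal vertex feedback set F and an arbitrary
% vertex feedback set F* of a graph in which every degree exceeds 1,
\sum_{v \in F} d(v) \;\le\; 2 \sum_{v \in F^{*}} d(v).
```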

  19. MGA - Analysis • Let Γ1(v) be the set of edges in G1 for which at least one endpoint is v. • From the description of the algorithm, for every vertex v in G1: [formula not transcribed], with equality if v Є F

  20. Theorem 3 • Theorem 3: Algorithm MGA always outputs a vertex feedback set F whose weight is no more than twice the weight of a minimum vertex feedback set F*. • Proof: • Let i = j+1; then: [formula not transcribed]

  21. Theorem 3 (Proof) • By grouping edges according to the iterations in which they are assigned a weight: [formula not transcribed] • Let [formula not transcribed] • By definition: [formula not transcribed]

  22. Theorem 3 (Proof) • Applying Theorem 6, and grouping edges by the iterations in which they are assigned a weight: [formulas not transcribed]

  23. Theorem 6 • Definitions: • Let [formula not transcribed] • Let dX(v) be the number of edges with one endpoint at v and the other at a vertex in X. • Theorem 6: Let G be a weighted graph in which every vertex has degree strictly greater than 1, let F be a minimal vertex feedback set of G, and let F* be an arbitrary vertex feedback set of G (possibly a minimum-weight one); then: [formula not transcribed] • [Figure: a vertex v with three edges into a set X, illustrating dX(v) = 3]

  24. Theorem 6 (Proof) • To prove the theorem, the l.h.s. is split into two terms and an upper bound is provided for each term. • Lemma 7: Let G, F and F* be defined as above. Then: [formula not transcribed] • Proof: • For every set of vertices B in G: [formula not transcribed] • Since d(v) ≥ 2 for each v in G, the last term is ≥ 0

  25. Theorem 6 (Lemma 7) • Since dB(v) ≤ d(v): [formula not transcribed] • We must prove that the following holds for some set of vertices B: [formula not transcribed]

  26. Lemma 7 • Let’s define a set B for which this inequality can be proven. • F is minimal, so each vertex in F can be associated with a cycle in G that contains no other vertices of F. • We define a graph H that is the union of these cycles (one cycle per vertex). • Definition: A linkpoint is a vertex of degree 2. A branchpoint is a vertex of degree larger than 2. • Every vertex in F is a linkpoint in H. • Let B be the set of vertices of H. • [Figure: a graph H with a linkpoint and a branchpoint marked]

  27. Lemma 7 • The proof is constructive. • We apply the following procedure to H, showing that there are terms on the r.h.s. that contribute 2 for each v Є F and were not used for any other v Є F. • Pick a vertex v Є F and follow the two paths p1 and p2 in H from it until the first branchpoint on each path is found. • There are 3 cases to consider: • Case 1: the paths end at two distinct branchpoints b1 and b2. • From the definition of H: dB(b1) − 2 ≥ 1 and dB(b2) − 2 ≥ 1 • So there are terms contributing 2 to the r.h.s. • [Figure: paths p1 to b1 and p2 to b2 from v]

  28. Lemma 7 • Case 2: both paths end at the same branchpoint b1. • If dB(b1) ≥ 4: dB(b1) − 2 ≥ 2 • If dB(b1) = 3: dB(b1) − 2 = 1, and following the third path p3 from b1 reaches another branchpoint b2 with dB(b2) − 2 ≥ 1 • Either way there are terms contributing 2 to the r.h.s. • Case 3: v lies on an isolated cycle of H. • There exists a vertex in F* that resides on no other cycle of H. • There are at most [bound not transcribed] such cases. • There are terms contributing 2 to the r.h.s. • [Figure: the configurations for cases 2 and 3]

  29. Lemma 7 • Remove the paths p1 and p2 from H, obtaining a graph in which each vertex in F still resides on a cycle that contains no other vertices of F. • Continue until F is exhausted.

  30. Lemma 7 • Lemma 8: Let G, F and F* be defined as above. Then: [formula not transcribed] • Proof: [formula not transcribed] • Note that (2−2) + (3−2) = 1 • [Figure: a five-vertex example with F and F* marked, annotated with the values 1, 1, −1]

  31. Lemma 8 • So if we prove [formula not transcribed], then we prove the lemma. • The graph induced by [set not transcribed] is a forest, and since the number of edges in a forest is smaller than the number of vertices, the following holds: [formula not transcribed]

  32. Theorem 6 (Proof) • To complete the proof, we use the bounds obtained from Lemmas 7 and 8: [formula not transcribed]
