
Exploring Multiply-Connected Graphs with the Junction Tree Algorithm in Bayesian Networks

This lecture focuses on the challenges that multiply-connected graphs present for inference in Bayesian networks (BNs). When there is more than one path between nodes, answering queries becomes complex. We discuss three primary strategies for handling this: clustering offending nodes into meganodes, conditioning on variables, and stochastic simulation. Clustering, the most popular approach, is explored through the Junction Tree Algorithm, covering moralization, triangulation, clique formation, and message passing to simplify computations and enable efficient querying.



Presentation Transcript


  1. CSCI 121 Special Topics: Bayesian Networks. Lecture #3: Multiply-Connected Graphs and the Junction Tree Algorithm

  2. Answering Queries: Problems
  (Figure: two example graphs on nodes A, B, C, D.)
  Inference is difficult if the graph is not singly connected (a.k.a. a polytree).
  Singly connected (polytree): exactly one path from A to D.
  Multiply connected: more than one path from A to D. P(D|A) = ?

  3. Dealing with Multiply Connected Graphs
  Three basic options:
  • Clustering – group “offending” nodes into “meganodes”
  • Conditioning – set variables to definite values, then build a polytree for each combination
  • Stochastic simulation – generate a large number of concrete models consistent with the domain
  Clustering seems to be the most popular approach.

  4. Clustering with the Junction-Tree Algorithm (Huang & Darwiche 1994)
  (Figure: an example DAG on nodes A through H and its moralized counterpart.)
  0) Note that a BN is a directed acyclic graph (DAG).
  1) “Moralize” the DAG: for each pair of parents A, B of a node C, draw an edge between A and B. Then remove the arrowheads, leaving an undirected graph.
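The moralization step can be sketched in a few lines of Python. The `moralize` helper and the parent sets below are illustrative assumptions (the slide shows only the node names A through H, not the exact arcs):

```python
from itertools import combinations

def moralize(parents):
    """Return the undirected 'moral' graph as a set of frozenset edges.

    `parents` maps each node to the list of its parents in the DAG.
    """
    edges = set()
    for child, pars in parents.items():
        # Keep the original arcs, but drop their direction.
        for p in pars:
            edges.add(frozenset((p, child)))
        # "Marry" every pair of parents of the same child.
        for a, b in combinations(pars, 2):
            edges.add(frozenset((a, b)))
    return edges

# Hypothetical DAG on the eight nodes from the slide:
parents = {"A": [], "B": ["A"], "C": ["A"], "D": ["B"], "E": ["B", "C"],
           "F": ["D", "E"], "G": ["C"], "H": ["E", "G"]}
moral = moralize(parents)
```

With these arcs, moralization marries B and C (parents of E), D and E (parents of F), and E and G (parents of H).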

  5. Clustering with the Junction-Tree Algorithm
  (Figure: the moral graph on nodes A through H before and after triangulation.)
  2) Triangulate the moral graph so that every cycle of length ≥ 4 contains an edge between two nonadjacent nodes (a chord). Use a heuristic, e.g. one based on the minimal number of edges added or the minimal number of possible values.
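The min-fill heuristic mentioned on the slide can be sketched as greedy node elimination: repeatedly eliminate the node whose neighbors need the fewest added edges to become a clique. The helper name and the 4-cycle demo are illustrative:

```python
from itertools import combinations

def triangulate(edges, nodes):
    """Greedy triangulation by node elimination (min-fill heuristic)."""
    adj = {v: set() for v in nodes}
    for e in edges:
        a, b = tuple(e)
        adj[a].add(b)
        adj[b].add(a)
    chordal = set(edges)
    remaining = set(nodes)
    while remaining:
        def fill_cost(v):
            nbrs = adj[v] & remaining
            return sum(1 for a, b in combinations(nbrs, 2)
                       if b not in adj[a])
        v = min(remaining, key=fill_cost)
        nbrs = adj[v] & remaining
        for a, b in combinations(nbrs, 2):
            if b not in adj[a]:          # add the missing chord
                adj[a].add(b)
                adj[b].add(a)
                chordal.add(frozenset((a, b)))
        remaining.remove(v)
    return chordal

# A 4-cycle A-B-C-D-A needs exactly one chord:
nodes = "ABCD"
edges = {frozenset(p) for p in ("AB", "BC", "CD", "AD")}
tri = triangulate(edges, nodes)
```

Eliminating any node of the 4-cycle adds one chord (A-C or B-D), after which the remaining graph is already chordal.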

  6. Clustering with the Junction-Tree Algorithm
  3) Build cliques from the triangulated graph: ABD, ADE, ACE, CEG, DEF, EGH.
  “Put your hands in the air and represent your clique!” – 112, “Peaches and Cream”
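One standard way to enumerate the maximal cliques of the triangulated graph is the Bron-Kerbosch algorithm. The edge list below is an assumption chosen to be consistent with the cliques named on the slide (the slide does not show the full edge set):

```python
def bron_kerbosch(adj, r=frozenset(), p=None, x=frozenset()):
    """Enumerate the maximal cliques of an undirected graph."""
    if p is None:
        p = frozenset(adj)
    if not p and not x:
        yield r            # r cannot be extended: it is maximal
        return
    for v in list(p):
        yield from bron_kerbosch(adj, r | {v}, p & adj[v], x & adj[v])
        p = p - {v}
        x = x | {v}

# Assumed edges of the triangulated graph from the slides:
edges = ["AB", "AC", "AD", "AE", "BD", "CE", "CG",
         "DE", "DF", "EF", "EG", "EH", "GH"]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)
cliques = {"".join(sorted(c)) for c in bron_kerbosch(adj)}
```

On this edge set the algorithm recovers exactly the six cliques named on the slide.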

  7. Clustering with the Junction-Tree Algorithm
  4) Connect the cliques through separation sets (sepsets) to form the junction tree: cliques ABD, ADE, ACE, CEG, DEF, EGH, joined by sepsets AD, AE, CE, DE, EG.
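A common way to build the junction tree is a maximum-weight spanning tree over the cliques, where an edge's weight is the size of the intersection (the sepset) of its two cliques. A Kruskal-style sketch, with `junction_tree` as an illustrative helper:

```python
from itertools import combinations

def junction_tree(cliques):
    """Connect cliques via a maximum-weight spanning tree, where the
    weight of a candidate edge is the size of the sepset (intersection)."""
    cliques = [frozenset(c) for c in cliques]
    candidates = sorted(
        ((len(a & b), a, b) for a, b in combinations(cliques, 2)),
        key=lambda t: -t[0])
    parent = {c: c for c in cliques}       # union-find forest
    def find(c):
        while parent[c] != c:
            c = parent[c]
        return c
    tree = []
    for w, a, b in candidates:
        if w == 0:
            continue
        ra, rb = find(a), find(b)
        if ra != rb:                       # joining two components
            parent[ra] = rb
            tree.append((a, a & b, b))     # (clique, sepset, clique)
    return tree

cliques = ["ABD", "ADE", "ACE", "CEG", "DEF", "EGH"]
jt = junction_tree(cliques)
seps = {"".join(sorted(s)) for _, s, _ in jt}
```

For the six cliques on the slide this yields five tree edges with sepsets AD, AE, CE, DE, EG, matching the slide.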

  8. Marginalization
  • At this point, each cluster (clique; meganode) has a joint probability table.
  • To query a variable, we (heuristically) pick a cluster containing it, and marginalize over the joint potential Φ from the table:

  A B D | Φ_ABD
  T T T | .225
  T T F | .025
  T F T | .125
  T F F | .125
  F T T | .180
  F T F | .020
  F F T | .150
  F F F | .150

  D | P(D)
  T | .225 + .125 + .180 + .150 = .680
  F | .025 + .125 + .020 + .150 = .320
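The marginalization on this slide can be reproduced directly. The dictionary encoding of Φ_ABD is an assumption; the numbers are taken from the slide:

```python
# Joint potential over cluster ABD, keyed by (a, b, d) truth values.
phi_ABD = {
    (True,  True,  True):  .225, (True,  True,  False): .025,
    (True,  False, True):  .125, (True,  False, False): .125,
    (False, True,  True):  .180, (False, True,  False): .020,
    (False, False, True):  .150, (False, False, False): .150,
}

def marginalize(phi, keep_index):
    """Sum out every variable except the one at keep_index."""
    out = {}
    for assignment, p in phi.items():
        key = assignment[keep_index]
        out[key] = out.get(key, 0.0) + p
    return out

p_D = marginalize(phi_ABD, 2)   # keep D (index 2 in the tuple)
```

This reproduces the slide's totals: P(D=T) = .680 and P(D=F) = .320.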

  9. Message-Passing
  • Sepset potentials are initialized via marginalization.
  • When evidence is presented (“John calls”), heuristically pick a “root clique” and pass messages around the junction tree (cliques ABD, ADE, ACE, CEG, DEF, EGH; sepsets AD, AE, CE, DE, EG):

  10. Message-Passing
  • Messages are passed from clique X to Y through sepset R by multiplication and division of table entries.
  • Evidence is set by “masking” table entries with a bit vector (or a probability distribution). E.g., observe B = F:

  A B D | Φ_ABD
  T T T | 0
  T T F | 0
  T F T | .125
  T F F | .125
  F T T | 0
  F T F | 0
  F F T | .150
  F F F | .150
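A sketch of evidence masking and the resulting sepset message, using the Φ_ABD numbers from the slide. The helper names and the multiply/divide update are illustrative of the Huang & Darwiche scheme, not code taken from it:

```python
phi_ABD = {
    (True,  True,  True):  .225, (True,  True,  False): .025,
    (True,  False, True):  .125, (True,  False, False): .125,
    (False, True,  True):  .180, (False, True,  False): .020,
    (False, False, True):  .150, (False, False, False): .150,
}

def apply_evidence(phi, index, value):
    """Zero out ('mask') entries inconsistent with an observation."""
    return {a: (p if a[index] == value else 0.0) for a, p in phi.items()}

def marginal(phi, keep):
    """Sum a potential down onto the variables at the given indices."""
    out = {}
    for a, p in phi.items():
        k = tuple(a[i] for i in keep)
        out[k] = out.get(k, 0.0) + p
    return out

# Observe B = F, then recompute the message to sepset AD (indices 0, 2).
masked = apply_evidence(phi_ABD, 1, False)
old_sep = marginal(phi_ABD, (0, 2))   # sepset potential before evidence
new_sep = marginal(masked, (0, 2))    # sepset potential after masking

# The receiving clique's table is scaled entrywise by new/old for the
# matching sepset assignment (the "division" half of the message pass).
update = {k: new_sep[k] / old_sep[k] for k in new_sep if old_sep[k]}
```

Masking zeroes the four B = T rows, exactly as in the table above, and the ratio `update` is what the neighboring clique ADE would absorb through sepset AD.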
