
A Hybrid Algorithm to Compute Marginal and Joint Beliefs in Bayesian Networks and its complexity






Presentation Transcript


  1. A Hybrid Algorithm to Compute Marginal and Joint Beliefs in Bayesian Networks and its complexity. Mark Bloemeke, Artificial Intelligence Laboratory, University of South Carolina. Marco Valtorta, Artificial Intelligence Laboratory, University of South Carolina. Presentation by Sreeja Vallabhan. Instructor: Marco Valtorta.

  2. Abstract. Two methods (algorithms) to update probabilities in a Bayesian network:
  • Use a secondary structure (a clique tree) and perform local, message-based calculations to extract the belief in each variable.
  • Use non-serial dynamic programming techniques to extract the belief in some desired group of variables.

  3. Goal. Present a hybrid algorithm that is based on non-serial dynamic programming techniques and also possesses the ability to retrieve the belief in all single variables.

  4. Symbolic Probabilistic Inference (SPI). Consider a Bayesian network with DAG G = (V, E) and conditional probability tables P(v_i \mid \pi(v_i)), where \pi(v_i) denotes the parents of v_i in G. The total joint probability is given by the chain rule of Bayesian networks:

  P(V) = \prod_{v_i \in V} P(v_i \mid \pi(v_i))    (1)

  The belief in any subset of variables V' \subseteq V is retrieved by marginalization:

  P(V') = \sum_{V \setminus V'} P(V)    (2)

  SPI is based on these two equations.
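As a concrete illustration of equations (1) and (2), the sketch below builds the full joint of a small network by the chain rule and then marginalizes down to a target pair of variables. The network A → B → C, its two states per variable, and the table entries are illustrative assumptions, not the paper's example.

```python
from itertools import product

# Illustrative two-state network A -> B -> C (an assumption for this sketch).
# Each table maps an assignment tuple to a probability.
P_A = {(0,): 0.6, (1,): 0.4}                                   # P(A)
P_B_A = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}   # P(B | A), key = (B, A)
P_C_B = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}   # P(C | B), key = (C, B)

# Equation (1): the chain rule builds the full joint P(A, B, C).
joint = {(a, b, c): P_A[(a,)] * P_B_A[(b, a)] * P_C_B[(c, b)]
         for a, b, c in product((0, 1), repeat=3)}

# Equation (2): marginalize out B to retrieve the belief in the subset {A, C}.
P_AC = {}
for (a, b, c), p in joint.items():
    P_AC[(a, c)] = P_AC.get((a, c), 0.0) + p

print(P_AC)   # P(A, C), summed over all states of B
```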

  5. Symbolic Probabilistic Inference (SPI). In SPI, to maintain control over the size of the intermediate tables and the time needed to compute them:
  • Variables are ordered before calculations.
  • Summations are pushed down into products.
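The effect of pushing summations down into products can be written out on a small assumed chain A → B → C → D (an illustration, not the paper's network). Computing P(A, C) naively sums the full four-variable product over B and D; distributing each summation so that it only covers the factors mentioning its variable keeps every intermediate table small:

```latex
\begin{align*}
P(A,C) &= \sum_{B}\sum_{D} P(A)\,P(B \mid A)\,P(C \mid B)\,P(D \mid C)\\
       &= P(A)\,\Bigl(\sum_{B} P(B \mid A)\,P(C \mid B)\Bigr)\Bigl(\sum_{D} P(D \mid C)\Bigr).
\end{align*}
```

On the second line no intermediate table ever mentions more than two variables, whereas the first line multiplies out a table over all four variables before summing.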

  6. Symbolic Probabilistic Inference (SPI). Consider the following Bayesian network. From equations (1) and (2), the joint probability of the variables A and C is obtained by multiplying all the conditional tables together and summing out the remaining variables. Assuming that each variable has two states, computing P(A, C) this way requires a total of 92 significant operations.
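In general form (the concrete network and its factorization are shown in the slide's figure, not reproduced in this transcript), the naive computation combines (1) and (2) directly, multiplying out the entire joint before summing:

```latex
P(A, C) \;=\; \sum_{V \setminus \{A, C\}} \; \prod_{v_i \in V} P\bigl(v_i \mid \pi(v_i)\bigr).
```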

  7. Symbolic Probabilistic Inference (SPI). With a single re-ordering of the terms combined by equation (1), followed by distribution of the summations from equation (2), the same computation requires only 32 significant operations.
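The saving from re-ordering can be reproduced on a small assumed network. The sketch below counts multiplications and additions for the two strategies on a five-variable binary chain A → B → C → D → E with target {A, C}; the chain, the dummy table values, and the resulting counts are assumptions for illustration and do not match the 92 and 32 operations of the paper's (different) example, but they show the same gap.

```python
from itertools import product

class Counter:
    mults = 0
    adds = 0

class Factor:
    """A probability table over named binary variables; arithmetic is counted."""
    def __init__(self, variables, table):
        self.variables = tuple(variables)   # variable names, in key order
        self.table = dict(table)            # assignment tuple -> value

    def multiply(self, other):
        out_vars = self.variables + tuple(v for v in other.variables
                                          if v not in self.variables)
        out = {}
        for assignment in product((0, 1), repeat=len(out_vars)):
            value = dict(zip(out_vars, assignment))
            a = tuple(value[v] for v in self.variables)
            b = tuple(value[v] for v in other.variables)
            out[assignment] = self.table[a] * other.table[b]
            Counter.mults += 1
        return Factor(out_vars, out)

    def marginalize(self, variable):
        keep = tuple(v for v in self.variables if v != variable)
        out = {}
        for assignment, p in self.table.items():
            key = tuple(x for v, x in zip(self.variables, assignment) if v != variable)
            if key in out:
                out[key] += p
                Counter.adds += 1
            else:
                out[key] = p
        return Factor(keep, out)

def cpt(child, parent=None):
    # Dummy uniform entries: only the operation counts matter here.
    if parent is None:
        return Factor((child,), {(0,): 0.5, (1,): 0.5})
    return Factor((child, parent), {(c, p): 0.5 for c in (0, 1) for p in (0, 1)})

def naive():
    # Multiply every table into the full joint, then sum out B, D and E.
    joint = cpt("A")
    for f in (cpt("B", "A"), cpt("C", "B"), cpt("D", "C"), cpt("E", "D")):
        joint = joint.multiply(f)
    for v in ("B", "D", "E"):
        joint = joint.marginalize(v)
    return joint

def reordered():
    # Push each summation down to the factors that mention its variable.
    sum_e = cpt("E", "D").marginalize("E")                          # over D
    sum_d = cpt("D", "C").multiply(sum_e).marginalize("D")          # over C
    sum_b = cpt("B", "A").multiply(cpt("C", "B")).marginalize("B")  # over A, C
    return cpt("A").multiply(sum_b).multiply(sum_d)                 # P(A, C)

for name, method in (("naive", naive), ("re-ordered", reordered)):
    Counter.mults = Counter.adds = 0
    method()
    print(f"{name}: {Counter.mults} multiplications, {Counter.adds} additions")
```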

  8. Factor Trees. A two-stage method for deriving the desired joint and single beliefs:
  • Creation of the factor tree.
  • A passing algorithm on the factor tree to retrieve the desired joint and single beliefs.

  9. Factor Trees Algorithm.
  • Start by calculating the optimal factoring order for the network, given the target set of variables whose joint is desired.
  • Construct a binary tree showing the combination of the initial probability tables and the conformal (intermediate) tables.
  • Label each edge between tables with the variables that are marginalized out along it before combination.
  • Add an additional head above the current root: a conformal table labeled with the target set of variables, reached by an edge with an empty label. A construction sketch follows below.
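The construction steps can be sketched as follows. This is a simplified reading of the list above, under stated assumptions: the pairwise combination order is taken as an input (a stand-in for the optimal factoring order of the first step), the names FTNode and build_factor_tree are hypothetical, and a variable is marginalized along an edge as soon as no other table and no target variable still needs it.

```python
from dataclasses import dataclass, field
from typing import FrozenSet, List

@dataclass
class FTNode:
    """One table in the factor tree: its scope and its (at most two) children.
    edge_labels[i] holds the variables summed out of children[i] before combination."""
    scope: FrozenSet[str]
    children: List["FTNode"] = field(default_factory=list)
    edge_labels: List[FrozenSet[str]] = field(default_factory=list)

def build_factor_tree(cpt_scopes, combination_order, target):
    """Build a binary factor tree from the scopes of the initial probability tables.

    cpt_scopes: one set of variable names per initial table.
    combination_order: (i, j) index pairs into the evolving node list (combined
        nodes are removed and their parent is appended); assumed to come from an
        externally computed factoring order.
    target: the set of variables whose joint is desired.
    """
    target = frozenset(target)
    nodes = [FTNode(frozenset(s)) for s in cpt_scopes]
    for i, j in combination_order:
        left, right = nodes[i], nodes[j]
        rest = [n.scope for k, n in enumerate(nodes) if k not in (i, j)]
        still_needed = target.union(*rest)
        # A variable is summed out along an edge only if the other child,
        # the remaining tables and the target set do not mention it.
        label_left = left.scope - right.scope - still_needed
        label_right = right.scope - left.scope - still_needed
        parent_scope = (left.scope - label_left) | (right.scope - label_right)
        parent = FTNode(parent_scope, [left, right], [label_left, label_right])
        nodes = [n for k, n in enumerate(nodes) if k not in (i, j)] + [parent]
    root = nodes[0]
    # The additional head carries the target set; its edge label is empty when
    # the factoring order has already reduced the root's scope to the target.
    return FTNode(target, [root], [root.scope - target])

# Tiny demo on an assumed chain A -> B -> C with target {A, C}:
# leaf scopes {A}, {A,B}, {B,C}; combine the last two tables first.
tree = build_factor_tree([{"A"}, {"A", "B"}, {"B", "C"}],
                         combination_order=[(1, 2), (0, 1)],
                         target={"A", "C"})
print(tree.scope, tree.edge_labels)   # scope = {A, C}, head edge label empty
```

The second stage, the passing algorithm over this tree that retrieves the joint and single beliefs (slides 8 and 10), is not sketched here.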

  10. Factor Trees Algorithm
