Presentation Transcript


  1. BAYESIAN NETWORKS CHAPTER#4 Book: Modeling and Reasoning with Bayesian Networks Author: Adnan Darwiche Publisher: Cambridge University Press, 2009

  2. Introduction • A Joint Probability Distribution (JPD) can be used to model uncertain beliefs and revise them in the face of hard and soft evidence • The problem with a JPD is that its size grows exponentially with the number of variables, which introduces modeling and computational difficulties (a quick count is sketched below)
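To make the exponential growth concrete, here is a minimal Python sketch (not from the book) that counts independent parameters for a full JPD versus the five-variable network introduced on slide 4, assuming all variables are binary:

```python
# Parameter counts: full joint table vs. a factored Bayesian network.
# The parents map encodes the DAG from slide 4 (B, E roots; E -> R;
# B, E -> A; A -> C). All variables are assumed binary.
parents = {"B": [], "E": [], "R": ["E"], "A": ["B", "E"], "C": ["A"]}

n = len(parents)
jpd_size = 2 ** n - 1  # independent entries in the full joint table

# A binary node X with k binary parents needs 2**k independent
# parameters: one Pr(X=true | u) per parent instantiation u.
bn_size = sum(2 ** len(u) for u in parents.values())

print(jpd_size, bn_size)  # 31 vs. 10
```

The gap widens quickly: with bounded parent-set sizes the network's parameter count grows roughly linearly in the number of variables, while the joint table keeps doubling.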

  3. Need for BN • A BN is a graphical modeling tool for compactly specifying a JPD • A BN relies on the basic insight that: “Independence forms a significant aspect of belief” and “Elicitation is relatively easy using the language of graphs”

  4. Example • A BN is a Directed Acyclic Graph (DAG) • Nodes are propositional variables • Edges are direct causal influences [Figure: DAG with edges Burglary (B) → Alarm (A), Earthquake (E) → Alarm (A), Earthquake (E) → Radio (R), Alarm (A) → Call (C)]
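As a rough sketch (an assumed encoding, not code from the book), the DAG can be represented as a map from each node to its parents, with Kahn's algorithm confirming acyclicity:

```python
# The slide's DAG as a parents map, plus an acyclicity check.
from collections import deque

parents = {
    "B": [],          # Burglary
    "E": [],          # Earthquake
    "R": ["E"],       # Radio report depends on Earthquake
    "A": ["B", "E"],  # Alarm depends on Burglary and Earthquake
    "C": ["A"],       # Call depends on Alarm
}

def is_dag(parents):
    """Return True iff the parents map describes an acyclic graph."""
    indegree = {v: len(ps) for v, ps in parents.items()}
    children = {v: [] for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].append(v)
    queue = deque(v for v, d in indegree.items() if d == 0)
    seen = 0
    while queue:
        v = queue.popleft()
        seen += 1
        for c in children[v]:
            indegree[c] -= 1
            if indegree[c] == 0:
                queue.append(c)
    return seen == len(parents)  # all nodes ordered => no cycle

print(is_dag(parents))  # True
```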

  5. Example • We would expect our belief in C to be influenced by evidence on R • For example, if we get a radio report that an earthquake took place, then our belief in the alarm triggering would increase, which would in turn increase our belief in receiving a call from a neighbor • However, we would not change our belief in C if we knew for sure that the alarm did not trigger • Thus C is independent of R given ¬A

  6. Formal Representation of Independence Given a variable V in a DAG G: • Parents(V) are the parents of V [the direct causes of V] • Descendants(V) are the variables N with a directed path from V to N [the effects of V] • Non_Descendants(V) are the variables other than V, its Parents, and its Descendants
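These three sets are mechanical to compute. A minimal sketch, again assuming the parents-map encoding from the earlier snippets:

```python
# Parents, descendants, and non-descendants of a node in the DAG.
parents = {"B": [], "E": [], "R": ["E"], "A": ["B", "E"], "C": ["A"]}

children = {v: [] for v in parents}
for v, ps in parents.items():
    for p in ps:
        children[p].append(v)

def descendants(v):
    """All nodes reachable from v by a directed path (the effects of v)."""
    out, stack = set(), list(children[v])
    while stack:
        n = stack.pop()
        if n not in out:
            out.add(n)
            stack.extend(children[n])
    return out

def non_descendants(v):
    """Everything that is neither v, a parent of v, nor a descendant of v."""
    return set(parents) - {v} - set(parents[v]) - descendants(v)

print(descendants("E"))      # {'R', 'A', 'C'}
print(non_descendants("A"))  # {'R'}
```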

  7. Independence Statement / Markovian Assumption I(V, Parents(V), Non_Descendants(V)) ….. (4.1) • That is, every variable is conditionally independent of its non-descendants given its parents. This is known as the Markovian Assumption, denoted Markov(G) • 4.1 can also be read as: given the direct causes of a variable, our belief in that variable will no longer be influenced by any other variable, except possibly by its effects

  8. Examples of Independence Statements • I(C, A, {B,E,R}) • I(R, E, {A,B,C}) • I(A, {B,E}, R) • I(B, ∅, {E,R}) • I(E, ∅, B) [Figure: the DAG from slide 4] (These statements can be generated mechanically; see the sketch below.)
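The slide's list is exactly Markov(G) for the example DAG. The following sketch (same assumed encoding as above) regenerates it by computing each variable's parents and non-descendants:

```python
# Enumerate the Markovian assumptions I(V, Parents(V), Non_Descendants(V)).
parents = {"B": [], "E": [], "R": ["E"], "A": ["B", "E"], "C": ["A"]}

children = {v: [] for v in parents}
for v, ps in parents.items():
    for p in ps:
        children[p].append(v)

def descendants(v):
    out, stack = set(), list(children[v])
    while stack:
        n = stack.pop()
        if n not in out:
            out.add(n)
            stack.extend(children[n])
    return out

for v in parents:
    nd = set(parents) - {v} - set(parents[v]) - descendants(v)
    print(f"I({v}, {set(parents[v]) or '∅'}, {nd or '∅'})")
    # e.g. I(A, {'B', 'E'}, {'R'}) -- matching the slide's list
```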

  9. Parameterizing the Independence Structure • Parameterizing means quantifying the dependencies between nodes and their parents • In other words, constructing the Conditional Probability Tables (CPTs) • For every variable X in the DAG G with parents U, we need to provide the probability Pr(x|u) for every value x of variable X and every instantiation u of the parents U
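A CPT for a binary variable can be stored as a map from parent instantiations to Pr(X=true | u). A minimal sketch for Alarm given Burglary and Earthquake; the probability values are illustrative assumptions of this example, not numbers from the book:

```python
cpt_alarm = {
    # (b, e) -> Pr(A=true | B=b, E=e); values are illustrative only
    (True,  True):  0.95,
    (True,  False): 0.94,
    (False, True):  0.29,
    (False, False): 0.001,
}

def pr_alarm(a, b, e):
    """Pr(A=a | B=b, E=e); the false case is 1 minus the stored entry."""
    p = cpt_alarm[(b, e)]
    return p if a else 1.0 - p

print(pr_alarm(False, True, False))  # ~0.06
```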

  10. Formal Definition of Bayesian Network • A Bayesian Network for variables Z is a pair (G, Θ) where: • G is a directed acyclic graph over variables Z, called the Network Structure • Θ is a set of CPTs, one for each variable in Z, called the Network Parameterization • Θ_X|U denotes the CPT for variable X and its parents U, and the set XU is referred to as a Network Family

  11. Def (continued..) • θ_x|u denotes the value assigned by CPT Θ_X|U to the conditional probability Pr(x|u), and is called a Network Parameter • An instantiation of all the network variables is called a Network Instantiation

  12. Chain Rule for Bayesian Networks • The probability of a network instantiation z is simply the product of all network parameters compatible with z: Pr(z) = ∏ θ_x|u, the product ranging over parameters θ_x|u whose instantiation xu is compatible with z
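A minimal sketch of the chain rule over the example network; the CPT values are illustrative assumptions:

```python
# Chain rule: Pr(z) is the product of the network parameters whose
# family instantiation xu is compatible with z.
parents = {"B": [], "E": [], "R": ["E"], "A": ["B", "E"], "C": ["A"]}

# Each CPT maps a parent instantiation (a tuple, in parents order) to
# Pr(X=true | u); the false case is the complement.
theta = {
    "B": {(): 0.001},
    "E": {(): 0.002},
    "R": {(True,): 0.9, (False,): 0.01},
    "A": {(True, True): 0.95, (True, False): 0.94,
          (False, True): 0.29, (False, False): 0.001},
    "C": {(True,): 0.9, (False,): 0.05},
}

def pr(z):
    """Probability of a full instantiation z (a dict variable -> bool)."""
    p = 1.0
    for x, u in parents.items():
        px_true = theta[x][tuple(z[v] for v in u)]  # compatible parameter
        p *= px_true if z[x] else 1.0 - px_true
    return p

z = {"B": False, "E": False, "R": False, "A": False, "C": False}
print(pr(z))  # product of the five compatible parameters
```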

  13. Properties of Probabilistic Independence • Recall: I_Pr(X,Z,Y) iff Pr(x|z,y) = Pr(x|z) or Pr(y|z) = 0, for all instantiations x, y, z • Graphoid Axioms: • Symmetry • Weak Union • Decomposition • Contraction (a numeric test of I_Pr is sketched after this list)
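I_Pr can be checked numerically against an explicit joint distribution. The sketch below is an assumption of this transcript, not code from the book; it uses the equivalent product form Pr(x,y,z)·Pr(z) = Pr(x,z)·Pr(y,z), which sidesteps division when Pr(y|z) = 0:

```python
# Numeric test of I_Pr(X, Z, Y) over an explicit joint distribution,
# stored as a dict mapping full instantiations to probabilities.
from itertools import product

def marginal(joint, variables, target):
    """Sum joint entries consistent with the partial instantiation target."""
    return sum(p for inst, p in joint.items()
               if all(inst[variables.index(v)] == val for v, val in target))

def independent(joint, variables, X, Z, Y):
    """Check I_Pr(X, Z, Y) for lists of binary variables X, Z, Y."""
    for xs in product([True, False], repeat=len(X)):
        for ys in product([True, False], repeat=len(Y)):
            for zs in product([True, False], repeat=len(Z)):
                x = list(zip(X, xs))
                y = list(zip(Y, ys))
                z = list(zip(Z, zs))
                lhs = marginal(joint, variables, x + y + z) * marginal(joint, variables, z)
                rhs = marginal(joint, variables, x + z) * marginal(joint, variables, y + z)
                if abs(lhs - rhs) > 1e-12:
                    return False
    return True

# Usage on a toy joint Pr(a,b,c) = Pr(c) Pr(a|c) Pr(b|c), so A and B
# are independent given C by construction.
variables = ["A", "B", "C"]
joint = {}
for a, b, c in product([True, False], repeat=3):
    pc = 0.3 if c else 0.7
    pa = (0.9 if a else 0.1) if c else (0.4 if a else 0.6)
    pb = (0.2 if b else 0.8) if c else 0.5
    joint[(a, b, c)] = pc * pa * pb
print(independent(joint, variables, ["A"], ["C"], ["B"]))  # True
```

Running such a checker on small hand-built distributions is a cheap way to sanity-check the four axioms on the following slides.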

  14. Symmetry I_Pr(X,Z,Y) if and only if I_Pr(Y,Z,X) • If learning Y does not influence our belief in X, then learning X does not influence our belief in Y • By Markov(G) we know that: I(A, {B,E}, R) • Using Symmetry: I(R, {B,E}, A) [Figure: the DAG from slide 4]

  15. Decomposition I_Pr(X, Z, Y∪W) only if I_Pr(X,Z,Y) and I_Pr(X,Z,W) • If learning yw does not influence our belief in x, then learning y alone or w alone does not influence our belief in x

  16. Weak Union I_Pr(X, Z, Y∪W) only if I_Pr(X, Z∪Y, W) • If the information yw is not relevant to our belief in x, then the partial information y will not make the rest of the information, w, relevant

  17. Contraction I_Pr(X,Z,Y) and I_Pr(X, Z∪Y, W) only if I_Pr(X, Z, Y∪W) • If, after learning the irrelevant information y, the information w is found to be irrelevant to our belief in x, then the combined information yw must have been irrelevant from the beginning
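For reference, the four axioms from slides 14-17 read compactly as:

```latex
% Graphoid axioms, restated from slides 14-17
\begin{align*}
\text{Symmetry:}      \quad & I_{Pr}(X,Z,Y) \iff I_{Pr}(Y,Z,X) \\
\text{Decomposition:} \quad & I_{Pr}(X,Z,Y \cup W) \implies I_{Pr}(X,Z,Y) \;\text{and}\; I_{Pr}(X,Z,W) \\
\text{Weak union:}    \quad & I_{Pr}(X,Z,Y \cup W) \implies I_{Pr}(X,Z \cup Y,W) \\
\text{Contraction:}   \quad & I_{Pr}(X,Z,Y) \;\text{and}\; I_{Pr}(X,Z \cup Y,W) \implies I_{Pr}(X,Z,Y \cup W)
\end{align*}
```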

  18. Questions ???
