
Bayes Nets



Presentation Transcript


  1. Bayes Nets

  2. Bayes Nets Quick Intro
  • Topic of much current research
  • Models dependence/independence in probability distributions
  • Graph based - aka “graphical models” or “belief nets”
  • Two kinds - directed and undirected

  3. Basic idea:
  • Model a joint distribution as a DAG with attributes (features) at the nodes
  • Sample distribution from sources
  [Diagram: example DAG with nodes Party, Aptitude, Study, Sleep, Pass Exam]
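The diagram on this slide only survives as node labels, so here is a minimal Python sketch of one plausible structure for that student network; the arc choices (Party → Study, Party → Sleep, and Aptitude/Study/Sleep → PassExam) are assumptions made for illustration, not taken from the slide.

```python
# A parent map for the slide's example network. The node names come from the
# slide; the specific arcs are assumed, since the diagram is not recoverable
# from the transcript.
dag = {
    "Party":    [],                                  # root
    "Aptitude": [],                                  # root
    "Study":    ["Party"],                           # assumed arc
    "Sleep":    ["Party"],                           # assumed arc
    "PassExam": ["Aptitude", "Study", "Sleep"],      # assumed arcs
}

# With this structure the joint distribution factorizes as
#   P(Party, Aptitude, Study, Sleep, PassExam)
#     = P(Party) P(Aptitude) P(Study|Party) P(Sleep|Party)
#       P(PassExam|Aptitude, Study, Sleep)
```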

  4. At each node is a conditional distribution for the attribute, depending only on the incoming arcs
  [Diagram: CPTs attached to the Party and Study nodes]
  Party:                P(yes) = 0.6   P(no) = 0.4
  Study | Party = yes:  P(yes) = 0.1   P(no) = 0.9
  Study | Party = no:   P(yes) = 0.9   P(no) = 0.1
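A small sketch of the two conditional tables that survive on this slide, written as nested dicts keyed by parent value; the representation is just one convenient choice.

```python
# CPTs recovered from the slide: the prior on Party and P(Study | Party).
P_party = {"yes": 0.6, "no": 0.4}

P_study_given_party = {
    "yes": {"yes": 0.1, "no": 0.9},   # Party = yes: studying is unlikely
    "no":  {"yes": 0.9, "no": 0.1},   # Party = no:  studying is likely
}

# Probability of one joint assignment, e.g. Party = yes, Study = no:
p = P_party["yes"] * P_study_given_party["yes"]["no"]   # 0.6 * 0.9 = 0.54
print(p)
```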

  5. Causes and Bayes’ Rule
  Diagnostic inference: knowing that the grass is wet, what is the probability that rain is the cause?
  [Diagram: two-node network Rain → Wet grass, with the causal and diagnostic inference directions labelled]

  6. Causal vs Diagnostic Inference
  Causal inference: if the sprinkler is on, what is the probability that the grass is wet?
  P(W|S) = P(W|R,S) P(R|S) + P(W|~R,S) P(~R|S)
         = P(W|R,S) P(R) + P(W|~R,S) P(~R)
         = 0.95 × 0.4 + 0.9 × 0.6 = 0.92
  Diagnostic inference: if the grass is wet, what is the probability that the sprinkler is on?
  P(S|W) = 0.35 > 0.2 = P(S)
  P(S|R,W) = 0.21
  Explaining away: knowing that it has rained decreases the probability that the sprinkler is on.
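The slide's numbers can be checked in a few lines of Python. P(R) = 0.4, P(S) = 0.2, P(W|R,S) = 0.95 and P(W|~R,S) = 0.9 are given above; the remaining entries P(W|R,~S) = 0.9 and P(W|~R,~S) = 0.1 are assumptions (they do reproduce the slide's 0.35 and 0.21).

```python
# Sprinkler/rain/wet-grass check. The two P(W | ..., ~S) entries are assumptions.
P_R = 0.4                                   # prior probability of rain
P_S = 0.2                                   # prior probability of sprinkler
P_W = {(True, True): 0.95, (False, True): 0.90,     # keyed by (rain, sprinkler)
       (True, False): 0.90, (False, False): 0.10}   # last two rows assumed

# Causal: P(W|S) = sum_R P(W|R,S) P(R), since R and S are a priori independent
p_w_given_s = P_W[(True, True)] * P_R + P_W[(False, True)] * (1 - P_R)
print(round(p_w_given_s, 2))                # 0.92

# Diagnostic: P(S|W) = P(W|S) P(S) / P(W)
p_w = sum(P_W[(r, s)] * (P_R if r else 1 - P_R) * (P_S if s else 1 - P_S)
          for r in (True, False) for s in (True, False))
print(round(p_w_given_s * P_S / p_w, 2))    # 0.35  ( > P(S) = 0.2 )

# Explaining away: P(S|R,W) = P(W|R,S) P(S) / P(W|R)
p_w_given_r = P_W[(True, True)] * P_S + P_W[(True, False)] * (1 - P_S)
print(round(P_W[(True, True)] * P_S / p_w_given_r, 2))   # 0.21
```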

  7. Bayesian Networks: Causes
  Causal inference:
  P(W|C) = P(W|R,S) P(R,S|C) + P(W|~R,S) P(~R,S|C) + P(W|R,~S) P(R,~S|C) + P(W|~R,~S) P(~R,~S|C)
  and use the fact that P(R,S|C) = P(R|C) P(S|C)
  Diagnostic: P(C|W) = ?
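A sketch of the same causal query once the common cause C is added. Only the shape of the computation comes from the slide; the values chosen for P(S|C) and P(R|C) below are illustrative assumptions.

```python
# P(W|C) = sum over R,S of P(W|R,S) P(R|C) P(S|C),
# using the conditional independence of R and S given their parent C.
P_S_given_C = {True: 0.1, False: 0.5}                 # assumed values
P_R_given_C = {True: 0.8, False: 0.1}                 # assumed values
P_W = {(True, True): 0.95, (False, True): 0.90,       # keyed by (rain, sprinkler)
       (True, False): 0.90, (False, False): 0.10}

def p_w_given_c(c):
    return sum(P_W[(r, s)]
               * (P_R_given_C[c] if r else 1 - P_R_given_C[c])
               * (P_S_given_C[c] if s else 1 - P_S_given_C[c])
               for r in (True, False) for s in (True, False))

print(round(p_w_given_c(True), 2))          # 0.76 with these assumed tables
```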

  8. Bayesian Nets: Local structure
  P(F|C) = ?

  9. Bayesian Networks: Inference
  P(C,S,R,W,F) = P(C) P(S|C) P(R|C) P(W|R,S) P(F|R)
  P(C,F) = ∑S ∑R ∑W P(C,S,R,W,F)
  P(F|C) = P(C,F) / P(C)
  Not efficient!
  • Belief propagation (Pearl, 1988)
  • Junction trees (Lauritzen and Spiegelhalter, 1988)
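For concreteness, here is a brute-force enumeration sketch of P(F|C) that follows the slide's factorization. Every CPT value below is a placeholder assumption; the slide's point stands either way: summing out the full joint is exponential in the number of hidden variables, which is what belief propagation and junction trees avoid.

```python
from itertools import product

# Enumeration over the factorization P(C,S,R,W,F) = P(C) P(S|C) P(R|C) P(W|R,S) P(F|R).
# All CPT numbers are placeholders; only the structure of the computation follows the slide.
P_C = 0.5
P_S_given_C = {True: 0.1, False: 0.5}
P_R_given_C = {True: 0.8, False: 0.1}
P_W_given_RS = {(True, True): 0.95, (False, True): 0.90,
                (True, False): 0.90, (False, False): 0.10}
P_F_given_R = {True: 0.70, False: 0.05}

def pr(p_true, value):
    """P(X = value) given P(X = True)."""
    return p_true if value else 1 - p_true

def joint(c, s, r, w, f):
    return (pr(P_C, c) * pr(P_S_given_C[c], s) * pr(P_R_given_C[c], r)
            * pr(P_W_given_RS[(r, s)], w) * pr(P_F_given_R[r], f))

def p_f_given_c(c=True, f=True):
    # P(C,F) = sum over S,R,W of the full joint; then P(F|C) = P(C,F) / P(C)
    p_cf = sum(joint(c, s, r, w, f) for s, r, w in product((True, False), repeat=3))
    return p_cf / pr(P_C, c)

print(round(p_f_given_c(), 2))              # 0.57 with these placeholder CPTs
```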

  10. Conditional independence
  • Given the values of some nodes, which others are independent?
  • Nodes depend on their ancestors through their parents
  • A node is conditionally independent of the rest of the graph once its parents, children, and children’s parents are known
  • Notion of a d-path: two nodes are dependent if there is a d-path between them (or by coincidence), independent otherwise

  11. D-paths go through:
  • linear non-evidence nodes (in both directions)
  • diverging non-evidence nodes
  • converging evidence nodes
  • converging non-evidence nodes with an evidence descendant
  (These activation rules are sketched in code below.)
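A minimal sketch of these rules for the three canonical triple shapes; the function name and the `z_has_evidence_descendant` flag are illustrative, not from the slides.

```python
# Is the path segment X - Z - Y "active" (able to carry dependence)
# given an evidence set? Covers the three triple shapes from the slide.
def triple_is_active(kind, z_in_evidence, z_has_evidence_descendant=False):
    """kind: 'linear'     X -> Z -> Y (chain),
             'diverging'  X <- Z -> Y (common cause),
             'converging' X -> Z <- Y (common effect)."""
    if kind in ("linear", "diverging"):
        # chains and forks carry dependence only while Z is NOT observed
        return not z_in_evidence
    if kind == "converging":
        # colliders carry dependence only if Z or a descendant of Z IS observed;
        # this is what makes "explaining away" (slide 6) work
        return z_in_evidence or z_has_evidence_descendant
    raise ValueError(f"unknown triple kind: {kind}")

# Sprinkler example: S -> W <- R is a converging triple, so observing W (wet grass)
# makes S and R dependent, while leaving W unobserved keeps them independent.
print(triple_is_active("converging", z_in_evidence=True))    # True
print(triple_is_active("converging", z_in_evidence=False))   # False
```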

  12. In general, working with Bayes nets is hard (easy for trees; approximate inference - loopy propagation, sampling - is also an option)
  • Concise representation of some distributions
  • Very flexible - can fix or learn the structure, can have unobserved nodes (latent variables)
  • Can capture conditional dependencies
  • Generalizes HMMs and other models

  13. Bayes nets problems:

  14. Bayes Nets Applications
  • Add probabilities to expert systems
  • Vista system (Horvitz) for space shuttle problem detection/remedies
  • Used in Microsoft products: Office ’95 Answer Wizard, Office ’97 Office Assistant, etc.
  • Computational biology
  • Special cases applied to speech recognition, turbo coding, and many other areas
  Next: decision trees
