
Belief networks





  1. Belief networks. • Conditional independence • Syntax and semantics • Exact inference • Approximate inference. Mundhenk and Itti, 2008. Based on material from S. Russell and P. Norvig. CS 460, Belief Networks

  2. Independence

  3. Conditional independence

  4. Conditional Independence. Other interactions may exist, but they are either insignificant, unknown, or irrelevant, so we leave them out. (Network: Cavity is the parent of Toothache and Catch.)

  5. Conditional Independence. We assume that a "catch" is not influenced by a toothache, and vice versa. (Network: Cavity is the parent of Toothache and Catch.)

  6. Conditional independence

  7. Conditional Independence. (1a) Since, given Cavity, Catch does not affect Toothache: P(Toothache | Catch, Cavity) = P(Toothache | Cavity).

  8. Conditional Independence. (1b) Algebraically, these statements are equivalent to: P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity).
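The two identities (1a) and (1b) can be checked numerically. A minimal sketch with made-up CPT numbers (the slides give no tables) for a joint distribution that factors through Cavity:

```python
# Toy joint over (Toothache, Catch, Cavity); the numbers are illustrative,
# chosen so that Toothache and Catch are conditionally independent given Cavity.
p_cavity = 0.2
p_tooth_given = {True: 0.6, False: 0.1}   # P(Toothache | Cavity)
p_catch_given = {True: 0.9, False: 0.2}   # P(Catch | Cavity)

def joint(tooth, catch, cavity):
    """Full joint as a product of the three factors above."""
    pc = p_cavity if cavity else 1 - p_cavity
    pt = p_tooth_given[cavity] if tooth else 1 - p_tooth_given[cavity]
    pk = p_catch_given[cavity] if catch else 1 - p_catch_given[cavity]
    return pc * pt * pk

# (1b): P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
lhs = joint(True, True, True) / p_cavity
rhs = p_tooth_given[True] * p_catch_given[True]
assert abs(lhs - rhs) < 1e-12

# (1a): P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
p_t_given_c_cav = joint(True, True, True) / (
    joint(True, True, True) + joint(False, True, True))
assert abs(p_t_given_c_cav - p_tooth_given[True]) < 1e-12
```

Conditioning on Cavity makes the two factors multiply out, which is exactly what statement (1b) asserts.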

  9. Conditional independence

  10. Belief networks

  11. Example. Probabilities are derived from prior observations.

  12. Typical Bayesian Network. Here we see both the topology and the Conditional Probability Tables (CPTs). (Network: Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls.)
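A minimal sketch of this network's CPTs in code, using the values from the standard Russell and Norvig alarm example (assumed, since the slide's own tables are not in the transcript):

```python
# CPTs for the burglary network; numbers are from the standard
# Russell & Norvig example (the slide's exact values may differ).
P_B = 0.001                      # P(Burglary)
P_E = 0.002                      # P(Earthquake)
P_A = {                          # P(Alarm | Burglary, Earthquake)
    (True, True): 0.95,
    (True, False): 0.94,
    (False, True): 0.29,
    (False, False): 0.001,
}
P_J = {True: 0.90, False: 0.05}  # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}  # P(MaryCalls | Alarm)

def joint(b, e, a, j, m):
    """Full joint = product of each node's CPT entry given its parents."""
    pb = P_B if b else 1 - P_B
    pe = P_E if e else 1 - P_E
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return pb * pe * pa * pj * pm
```

The five small tables replace a full joint over 2^5 = 32 worlds; any entry of that joint is recovered by multiplying one row from each table.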

  13. Semantics

  14. Semantics

  15. Markov blanket
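For reference, a node's Markov blanket is its parents, its children, and its children's other parents; given the blanket, the node is independent of the rest of the network. A sketch over the alarm network used elsewhere in the deck (node set assumed from slides 12 and 17):

```python
# Markov blanket = parents + children + children's other parents.
# Graph given as a child -> parents map for the burglary network.
parents = {
    "Burglary": set(),
    "Earthquake": set(),
    "Alarm": {"Burglary", "Earthquake"},
    "JohnCalls": {"Alarm"},
    "MaryCalls": {"Alarm"},
}

def markov_blanket(node):
    children = {c for c, ps in parents.items() if node in ps}
    co_parents = set()
    for c in children:
        co_parents |= parents[c]    # the children's other parents
    co_parents -= {node}
    return parents[node] | children | co_parents

print(sorted(markov_blanket("Alarm")))
# ['Burglary', 'Earthquake', 'JohnCalls', 'MaryCalls']
```

Note that Burglary's blanket contains Earthquake even though there is no edge between them: they share the child Alarm, so observing Alarm couples them.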

  16. Constructing belief networks

  17. Example: Full Joint Distribution. What is the probability that the alarm has sounded, but neither a burglary nor an earthquake has occurred, and both John and Mary call?

  18. (Network diagram: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls.)
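The query on slide 17 factors along the network as P(j, m, a, ¬b, ¬e) = P(j|a) P(m|a) P(a|¬b, ¬e) P(¬b) P(¬e). With the standard Russell and Norvig CPT values (assumed, since the slide's tables are not in the transcript) this works out as:

```python
# P(j, m, a, ¬b, ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
# CPT values assumed from the standard Russell & Norvig alarm example.
p = 0.90 * 0.70 * 0.001 * 0.999 * 0.998
print(f"{p:.6f}")  # 0.000628
```

So the specific world asked about on slide 17 is quite unlikely, dominated by the small prior P(a | ¬b, ¬e) = 0.001.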

  19. What if we find a new variable? We find that storms can also set off alarms, so we add Storm into our CPT. Notice that JohnCalls and MaryCalls stay the same: storms were always there but were simply unaccounted for, so John and Mary did not change. However, we now model P(A) with better precision. (New node: Storm, a third parent of Alarm.)

  20. What if we cause a new variable? (Or a new variable just occurs.) What if we inject a new cause that was not there before? If we pay a crazy guy to set off the alarm frequently, JohnCalls and MaryCalls may no longer be valid, since we may have changed the behaviors: the alarm now goes off so often that John and Mary are more likely to ignore it. (New node: CrazyGuy, a parent of Alarm.)

  21. What if we cause a new variable? If the introduced variable is highly erratic, it can invalidate even more of the CPT than we would like.

  22. What if we cause a new variable? However, some changes to the CPT may be absurd, so we may never have to worry about them.

  23. Things in the model can change. We can account for change in the model over time (a more advanced topic). • John and Mary may be more or less likely to call at certain times; cyclical repetition may not be too difficult to model. • People may tire of their jobs and become less likely to call over longer periods; this may be easy or difficult to model. • Crime picks up: if the trend is slow enough, the model may be able to adjust online even if we have never observed crime picking up before, but a fast trend can easily and totally throw the model off. • Keep in mind, though, that we may get good enough results from our model even without accounting for changes over time.

  24. How would we apply this to Robotics? The CPT can describe other events and probabilities, such as action success given observations. (Nodes: Tree in Path, Rock in Path, Obstacle Detect, Execute Stop, Execute Turn.)

  25. Exact Inference in a Bayesian Network. What if we want to make an inference such as: what is the probability of a tree in the path, given that the robot has stopped and turned? This might be useful to a robot that can judge whether there is a tree in a path based on the behavior of another robot: if robot A sees robot B turn or stop, it might infer that there is a tree in the path. We then normalize the result.

  26. Bayesian Inference, cont'd. We compute the unnormalized probabilities for tree and for no tree, and then normalize. This is essentially an enumeration of all situations for tree and not tree.

  27. Bayesian Inference, cont'd. Finishing: the unnormalized values are 0.06 for tree and 0.07497 for no tree, so the probability of a tree in the path is 0.06 / (0.06 + 0.07497) ≈ 0.4445.
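The final normalization step can be checked directly; the two unnormalized sums, 0.06 and 0.07497, are the values from the slide:

```python
# Normalizing the two unnormalized enumeration results from slide 27.
p_tree_unnorm = 0.06        # sum over hidden variables with Tree = true
p_no_tree_unnorm = 0.07497  # sum over hidden variables with Tree = false

z = p_tree_unnorm + p_no_tree_unnorm   # normalizing constant
p_tree = p_tree_unnorm / z
print(round(p_tree, 4))  # 0.4445
```

Because both sums carry the same evidence probability in their denominator, dividing by their total cancels it, which is why enumeration never needs P(evidence) explicitly.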

  28. What about hidden variables?

  29. Example: car diagnosis

  30. Example: car insurance

  31. Compact conditional distributions

  32. Compact conditional distributions. (Diagram labels: Know, Infer.)
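In Russell and Norvig, the canonical compact conditional distribution is the noisy-OR model; a sketch using the book's illustrative fever example (the slide's own numbers are not in the transcript):

```python
# Noisy-OR: a compact CPT for a Boolean child with many Boolean causes.
# Instead of 2^k table rows, we store one inhibition probability per cause:
# the chance that cause alone fails to produce the effect.
# (Illustrative values from the Russell & Norvig fever example.)
q = {"Cold": 0.6, "Flu": 0.2, "Malaria": 0.1}  # P(no fever | only that cause)

def p_effect(active_causes):
    """P(Fever | the given causes present, all other causes absent)."""
    prod = 1.0
    for cause in active_causes:
        prod *= q[cause]   # effect absent only if every cause is inhibited
    return 1.0 - prod

print(round(p_effect({"Cold", "Flu"}), 4))  # 0.88
```

With k causes, noisy-OR needs only k parameters instead of 2^k rows, which is the compactness the slide title refers to.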

  33. Hybrid (discrete + continuous) networks. If Subsidy were a hidden variable, would it be discrete and discreet?

  34. Continuous child variables

  35. Continuous child variables

  36. Discrete variable w/ continuous parents

  37. Discrete variable

  38. Inference in belief networks. • Exact inference by enumeration • Exact inference by variable elimination • Approximate inference by stochastic simulation • Approximate inference by Markov chain Monte Carlo (MCMC)
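A minimal sketch of the third bullet, approximate inference by stochastic simulation, using rejection sampling on the alarm network (CPT values assumed from the standard Russell and Norvig example):

```python
import random

# Rejection sampling on the burglary network: sample whole worlds from the
# prior, discard those inconsistent with the evidence, and count the rest.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def sample():
    """Draw one world by sampling each node given its sampled parents."""
    b = random.random() < P_B
    e = random.random() < P_E
    a = random.random() < P_A[(b, e)]
    j = random.random() < P_J[a]
    m = random.random() < P_M[a]
    return b, e, a, j, m

def estimate_p_burglary_given_calls(n=200_000):
    """Estimate P(Burglary | JohnCalls, MaryCalls)."""
    hits = kept = 0
    for _ in range(n):
        b, e, a, j, m = sample()
        if j and m:          # reject samples that contradict the evidence
            kept += 1
            hits += b
    return hits / kept if kept else float("nan")

random.seed(0)
print(estimate_p_burglary_given_calls())
# Exact answer is about 0.284; the estimate is noisy because the evidence
# (both people calling) is rare, so few samples survive rejection.
```

The wasted samples illustrate why the deck's fourth bullet, MCMC, exists: it explores only worlds consistent with the evidence instead of discarding almost everything.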
