  1. CSCE 580 Artificial Intelligence. Ch. 6 [P]: Reasoning Under Uncertainty. Sections 6.2 and 6.3: Independence and Belief (Bayesian) Networks. Fall 2009. Marco Valtorta, mgv@cse.sc.edu. "It is remarkable that a science which began with the consideration of games of chance should become the most important object of human knowledge... The most important questions of life are, for the most part, really only problems of probability... The theory of probabilities is at bottom nothing but common sense reduced to calculus." --Pierre Simon de Laplace, 1812. "Probability does not exist." --Bruno de Finetti, 1970

  2. Acknowledgment
• The slides are based on the textbook [P] and other sources, including other fine textbooks:
  • [AIMA-2]
  • David Poole, Alan Mackworth, and Randy Goebel. Computational Intelligence: A Logical Approach. Oxford, 1998. (A second edition, by Poole and Mackworth, is under development; Dr. Poole allowed us to use a draft of it in this course.)
  • Ivan Bratko. Prolog Programming for Artificial Intelligence, Third Edition. Addison-Wesley, 2001. (The fourth edition is under development.)
  • George F. Luger. Artificial Intelligence: Structures and Strategies for Complex Problem Solving, Sixth Edition. Addison-Wesley, 2009.

  3. Conditional Independence

  4. Conditional Independence Is Symmetric

  5. Example domain (diagnostic assistant)

  6. Examples of conditional independence
• The identity of the queen of Canada is independent of whether light l1 is lit, given whether there is outside power.
• Whether there is someone in a room is independent of whether light l2 is lit, given the position of switch s3.
• Whether light l1 is lit is independent of the position of light switch s2, given whether there is power in wire w0.
• Every other variable may be independent of whether light l1 is lit, given whether there is power in wire w0 and the status of light l1 (whether it is ok and, if not, how it is broken).
• Conditional independence is defined using numbers, but it can often be established by qualitative arguments.
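Conditional independence is defined numerically: X is independent of Y given Z when P(X | Y, Z) = P(X | Z) whenever the conditional probability is defined (and, as slide 4 notes, this relation is symmetric in X and Y). A minimal self-contained check of the first example above, with made-up numbers chosen so that the independence holds:

```python
from itertools import product

# Hypothetical joint P(power, lit, queen), constructed so that, given
# power, `lit` is independent of `queen` (the identity of the queen of
# Canada). World key order: (power, lit, queen), all Boolean.
p_power = {True: 0.9, False: 0.1}
p_lit_given_power = {True: 0.95, False: 0.02}   # P(lit = True | power)
p_queen = 0.5                                    # independent of everything
joint = {}
for power, lit, queen in product([True, False], repeat=3):
    p_lit = p_lit_given_power[power] if lit else 1 - p_lit_given_power[power]
    p_q = p_queen if queen else 1 - p_queen
    joint[(power, lit, queen)] = p_power[power] * p_lit * p_q

def prob(pred):
    """Sum the joint over all worlds satisfying `pred`."""
    return sum(p for w, p in joint.items() if pred(w))

def cond(pred_a, pred_b):
    """P(A | B) by the ratio definition."""
    return prob(lambda w: pred_a(w) and pred_b(w)) / prob(pred_b)

# Check P(lit | queen, power) == P(lit | power) for power = True:
lhs = cond(lambda w: w[1], lambda w: w[2] and w[0])
rhs = cond(lambda w: w[1], lambda w: w[0])
print(abs(lhs - rhs) < 1e-12)   # True: lit is independent of queen given power
```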

  7. Idea of belief (Bayesian) networks

  8. Bayesian networks
• The method of strata for constructing a Bayesian network is given above.
• Usually, a Bayesian network is defined to also include probabilities, as in the following slide.

  9. Components of a Bayesian network
A belief network consists of:
• a directed acyclic graph with nodes labeled with random variables;
• a domain for each random variable;
• a set of conditional probability tables, one for each variable given its parents (a variable with no parents gets its prior probability table).
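As a minimal sketch (the names and data layout are illustrative, not from the slides), the three parts can be stored directly. Here every domain is Boolean, so each table stores only the probability that the variable is true:

```python
# A belief network as: a DAG (parents map), domains, and one CPT per
# variable given its parents. Domains here are all {True, False}, so a
# CPT maps each tuple of parent values to P(var = True | those values).
network = {
    "parents": {"Power": [], "Lit": ["Power"]},
    "domains": {"Power": [True, False], "Lit": [True, False]},
    "cpts": {
        "Power": {(): 0.9},                       # prior: no parents
        "Lit": {(True,): 0.95, (False,): 0.02},   # P(Lit = True | Power)
    },
}

def joint_prob(net, world):
    """P(world) as the product of P(var | parents(var)) over all variables."""
    p = 1.0
    for var, pars in net["parents"].items():
        pt = net["cpts"][var][tuple(world[v] for v in pars)]
        p *= pt if world[var] else 1.0 - pt
    return p

print(joint_prob(network, {"Power": True, "Lit": True}))   # 0.9 * 0.95 = 0.855
```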

  10. Example BN (6.10) Suppose we want to use the diagnostic assistant to diagnose whether there is a fire in a building, based on noisy sensor information and possibly conflicting explanations of what could be going on. The agent receives a report about whether everyone is leaving the building. Suppose the report sensor is noisy: it sometimes reports leaving when there is no exodus (a false positive), and sometimes does not report when everyone is leaving (a false negative). Suppose the fire alarm going off can cause the leaving, but this is not a deterministic relationship. Either tampering or fire could affect the alarm. Fire also causes smoke to rise from the building.
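This network fits the representation sketched after slide 9: six Boolean variables with arcs Tampering → Alarm, Fire → Alarm, Fire → Smoke, Alarm → Leaving, Leaving → Report. The probabilities below are illustrative placeholders in the spirit of the example, not the textbook's tables:

```python
# The fire-alarm network, in the same CPT convention as the earlier
# sketch (each table stores P(var = True | parent values)).
# All probabilities below are illustrative placeholders.
fire_net = {
    "parents": {
        "Tampering": [], "Fire": [],
        "Alarm": ["Tampering", "Fire"],
        "Smoke": ["Fire"],
        "Leaving": ["Alarm"],
        "Report": ["Leaving"],
    },
    "cpts": {
        "Tampering": {(): 0.02},
        "Fire":      {(): 0.01},
        "Alarm":     {(True, True): 0.50, (True, False): 0.85,
                      (False, True): 0.99, (False, False): 0.0001},
        "Smoke":     {(True,): 0.90, (False,): 0.01},
        "Leaving":   {(True,): 0.88, (False,): 0.001},
        "Report":    {(True,): 0.75, (False,): 0.01},
    },
}

def joint_prob(net, world):
    """Same helper as in the sketch after slide 9, repeated for completeness."""
    p = 1.0
    for var, pars in net["parents"].items():
        pt = net["cpts"][var][tuple(world[v] for v in pars)]
        p *= pt if world[var] else 1.0 - pt
    return p

w = {"Tampering": False, "Fire": True, "Alarm": True,
     "Smoke": True, "Leaving": True, "Report": True}
print(joint_prob(fire_net, w))   # probability of one complete world
```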

  11. Example BN (6.11) Let's select an ordering in which the causes of a variable come before the variable itself. For example, the variable for whether a light is lit comes after the variables for whether the light is working and whether there is power coming into the light.

  12. Independence Assumptions in BNs

  13. Example for D-Separation

  14. D-separation: converging connections

  15. D-Separation: Diverging Connections

  16. D-Separation: Serial Connections
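The three connection types on these slides determine when a path is blocked: a serial (X → Z → Y) or diverging (X ← Z → Y) connection is blocked when Z is observed, while a converging connection (X → Z ← Y) is blocked unless Z or one of its descendants is observed. X and Y are d-separated given evidence set Z when every path between them is blocked. A minimal checker in the style of the standard reachability ("Bayes ball") algorithm; all names are mine, not from the slides:

```python
from collections import deque

def d_separated(parents, x, y, z):
    """True iff x and y are d-separated given the set z, in the DAG
    described by `parents` (node -> list of its parents).
    Assumes x and y are not themselves in z."""
    children = {n: [] for n in parents}
    for n, ps in parents.items():
        for p in ps:
            children[p].append(n)
    # Evidence nodes and their ancestors: a converging node is active
    # exactly when it is in this set (it or a descendant is observed).
    anc = set()
    stack = list(z)
    while stack:
        n = stack.pop()
        if n not in anc:
            anc.add(n)
            stack.extend(parents[n])
    # Search over (node, direction) pairs; 'up' means we arrived from a child.
    visited = set()
    queue = deque([(x, 'up')])
    while queue:
        n, d = queue.popleft()
        if (n, d) in visited:
            continue
        visited.add((n, d))
        if n == y:
            return False                      # found an active trail to y
        if d == 'up' and n not in z:
            # serial and diverging connections pass through an unobserved node
            for p in parents[n]:
                queue.append((p, 'up'))
            for c in children[n]:
                queue.append((c, 'down'))
        elif d == 'down':
            if n not in z:                    # serial connection continues
                for c in children[n]:
                    queue.append((c, 'down'))
            if n in anc:                      # converging connection is active
                for p in parents[n]:
                    queue.append((p, 'up'))
    return True

# Serial: A -> B -> C is blocked exactly when B is observed.
chain = {'A': [], 'B': ['A'], 'C': ['B']}
print(d_separated(chain, 'A', 'C', set()), d_separated(chain, 'A', 'C', {'B'}))  # False True

# Diverging: A <- B -> C is blocked exactly when B is observed.
fork = {'B': [], 'A': ['B'], 'C': ['B']}
print(d_separated(fork, 'A', 'C', set()), d_separated(fork, 'A', 'C', {'B'}))    # False True

# Converging: A -> B <- C is blocked UNLESS B (or a descendant) is observed.
coll = {'A': [], 'C': [], 'B': ['A', 'C']}
print(d_separated(coll, 'A', 'C', set()), d_separated(coll, 'A', 'C', {'B'}))    # True False
```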

  17. Using BNs: Diagnostic Assistant The power network can be used by the diagnostic assistant in a number of ways:
• Conditioning on whether the switches and circuit breakers are ok, on whether there is outside power, and on the positions of the switches, you can simulate how the lighting should work.
• Given values for the switches and the outside power, and observations of whether the lights are lit, you can determine the posterior probability that each switch or circuit breaker is ok.
• Given some switch positions, some outputs, and some intermediate values, you can determine the probability of any other variable in the network.
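All three uses come down to the same computation, which is also the topic of the next slide: condition on the observed variables and sum the joint distribution over the remaining ones. A self-contained enumeration sketch on a tiny two-variable fragment of such a network (the numbers are illustrative):

```python
from itertools import product

# Tiny fragment: outside power -> light l1 lit (illustrative numbers).
parents = {"Power": [], "Lit": ["Power"]}
cpts = {"Power": {(): 0.9}, "Lit": {(True,): 0.95, (False,): 0.02}}

def joint(world):
    """P(world) as the product of the relevant CPT entries."""
    p = 1.0
    for var, pars in parents.items():
        pt = cpts[var][tuple(world[v] for v in pars)]
        p *= pt if world[var] else 1.0 - pt
    return p

def posterior(query_var, evidence):
    """P(query_var = True | evidence) by enumerating all worlds."""
    variables = list(parents)
    num = den = 0.0
    for values in product([True, False], repeat=len(variables)):
        world = dict(zip(variables, values))
        if any(world[v] != val for v, val in evidence.items()):
            continue                      # world inconsistent with evidence
        p = joint(world)
        den += p
        if world[query_var]:
            num += p
    return num / den

# Diagnostic direction: probability the outside power is on, given l1 is lit.
print(posterior("Power", {"Lit": True}))   # ~0.998
```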

  18. Updating Probabilities by Conditioning

  19. Some features and properties of BNs
• A belief network is automatically acyclic, by construction.
• A belief network is a directed acyclic graph (DAG) whose nodes are random variables.
• The parents of a node n are those variables on which n directly depends.
• A belief network is a graphical representation of dependence and independence: a variable is independent of its non-descendants given its parents.

  20. Constructing BNs To represent a domain in a belief network, you need to consider:
• What are the relevant variables?
• What will you observe?
• What would you like to find out (query)?
• What other features make the model simpler?
• What values should these variables take?
• What is the relationship between them? This should be expressed in terms of local influence.
• How does the value of each variable depend on its parents? This is expressed in terms of the conditional probabilities.

  21. Example 5.30 and 6.14 The process of diagnosis is carried out by conditioning on the observed symptoms and deriving posterior probabilities of the faults or diseases.

  22. Help System Example A naïve Bayesian classifier:
P(H | F1, F2, …, Fn) = K × P(H) × P(F1, F2, …, Fn | H) = K × P(H) × P(F1 | H) × P(F2 | H) × … × P(Fn | H),
where K = 1 / P(F1, F2, …, Fn) is a normalizing constant and the second equality uses the naïve assumption that the features Fi are conditionally independent given the hypothesis H (in the help system, the page the user is interested in).
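A sketch of this classifier on made-up help-system data; H ranges over help pages and each Fi is a Boolean feature of the user's request. All page names, feature names, and numbers below are hypothetical, not from the course materials:

```python
# Naive Bayes for the help system: score each page H by
# P(H) * prod_i P(Fi | H); K is the same for every H, so normalize once.
priors = {"print_help": 0.4, "email_help": 0.6}        # P(H), hypothetical
likelihoods = {                                        # P(Fi = True | H)
    "print_help": {"mentions_printer": 0.80, "mentions_attachment": 0.10},
    "email_help": {"mentions_printer": 0.05, "mentions_attachment": 0.70},
}

def posterior(features):
    """Return P(H | features) for every H, normalizing at the end (the K)."""
    scores = {}
    for h, prior in priors.items():
        p = prior
        for f, value in features.items():
            pf = likelihoods[h][f]
            p *= pf if value else 1.0 - pf
        scores[h] = p
    k = 1.0 / sum(scores.values())        # normalizing constant K
    return {h: k * p for h, p in scores.items()}

print(posterior({"mentions_printer": True, "mentions_attachment": False}))
# print_help dominates: 0.4*0.8*0.9 = 0.288 vs 0.6*0.05*0.3 = 0.009
```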
