
CS3243: Introduction to Artificial Intelligence

This article explores the concepts of uncertainty and probability in artificial intelligence, covering topics such as syntax, semantics, inference, and independence. It discusses the challenges of handling uncertainty in AI systems and introduces Bayesian inference and Bayesian networks as solutions. The text is based on the CS3243 course on Introduction to Artificial Intelligence and the AIMA textbook.


Presentation Transcript


  1. CS3243: Introduction to Artificial Intelligence Semester 2, 2017/2018

  2. Uncertainty AIMA Chapter 13

  3. Outline • Uncertainty • Probability • Syntax and Semantics • Inference • Independence and Bayes’ Rule

  4. Uncertainty: Motivating Example Let the taxi agent’s action $A_t$ = leave for the airport $t$ minutes before the flight. Will $A_t$ get me there on time? • Sources of uncertainty: • Partial observability (e.g., road state, other drivers’ plans, …) • Noisy sensors (e.g., traffic reports, fuel sensor, …) • Uncertainty in action outcomes (e.g., flat tire, accident, …) • Complexity in modeling and predicting traffic (e.g., congestion) • A logical agent either • risks falsehood: “$A_{25}$ will get me there on time”, or • reaches a weaker conclusion: “$A_{25}$ will get me there on time if there’s no accident on the bridge and it doesn’t rain and my tires remain intact…”

  5. Random Variables

  6. Events • Given a random variable $X$, let $dom(X)$ be its domain. • Atomic event (possible world): an assignment of a value to each random variable; a singleton event • Example: we roll two different dice; an atomic event is one pair of values, one per die

  7. Events • Let the red die correspond to the RV $X$ and the blue die correspond to $Y$ • Event: any set of atomic events, e.g., “the two dice sum to 7” is the event $\{(x, y) : x + y = 7\}$

  8. Axioms of Probability • Let $X$ be a random variable with finite domain $dom(X)$. • A probability distribution over $X$ assigns a value $P(x)$ to every $x \in dom(X)$ s.t. $P(x) \geq 0$ and $\sum_{x \in dom(X)} P(x) = 1$ • For any event $E \subseteq dom(X)$ we have $P(E) = \sum_{x \in E} P(x)$ • In particular, $P(\emptyset) = 0$ and $P(dom(X)) = 1$
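A minimal sketch of these definitions in Python, using the two-dice example from the surrounding slides (the uniform distribution and the helper names are illustrative assumptions):

```python
from itertools import product

# Atomic events for two fair dice: every (red, blue) value pair.
atomic_events = list(product(range(1, 7), repeat=2))

# A probability distribution assigns a non-negative value to each
# atomic event, and the values sum to 1 (uniform here).
P = {w: 1 / 36 for w in atomic_events}
assert all(p >= 0 for p in P.values())
assert abs(sum(P.values()) - 1) < 1e-12

def prob(event):
    """P(E) = sum of P(w) over atomic events w in E."""
    return sum(P[w] for w in event)

# Example event: the two dice sum to 7.
sum_is_7 = {(x, y) for (x, y) in atomic_events if x + y == 7}
print(prob(sum_is_7))  # 6/36 ≈ 0.1667
```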

  9. Joint Probability • Given two random variables $X$ and $Y$, the joint probability of an atomic event is $P(X = x, Y = y)$ • In particular, $\sum_{x \in dom(X)} \sum_{y \in dom(Y)} P(X = x, Y = y) = 1$

  10. Posterior/Conditional Probability $P(A \mid B)$: the probability that an event $A$ occurs, given that some other event $B$ occurs.

  11. Posterior/Conditional Probability $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$, assuming that $P(B) > 0$. Bayes’ rule: $P(A \mid B) = \frac{P(B \mid A) \, P(A)}{P(B)}$. Chain rule, derived by successive application of the product rule $P(A \cap B) = P(A \mid B) \, P(B)$: $P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \dots, X_{i-1})$
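A short self-contained check of these identities on the two-dice model (the event choices are illustrative):

```python
from itertools import product

# Two fair dice, as in the earlier slides.
P = {w: 1 / 36 for w in product(range(1, 7), repeat=2)}

def prob(event):
    return sum(P[w] for w in event)

def cond_prob(event_a, event_b):
    """P(A | B) = P(A ∩ B) / P(B), assuming P(B) > 0."""
    return prob(event_a & event_b) / prob(event_b)

sum_is_7 = {w for w in P if w[0] + w[1] == 7}
red_is_6 = {w for w in P if w[0] == 6}

# Bayes' rule: P(A | B) = P(B | A) P(A) / P(B)
lhs = cond_prob(red_is_6, sum_is_7)
rhs = cond_prob(sum_is_7, red_is_6) * prob(red_is_6) / prob(sum_is_7)
assert abs(lhs - rhs) < 1e-12
print(lhs)  # 1/6
```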

  12. Independence Two events $A$ and $B$ are independent if $P(A \cap B) = P(A) \, P(B)$. Equivalent to $P(A \mid B) = P(A)$: “Knowing $B$ adds no information about $A$.” Example: rolling two dice, the value of the red die is independent of the value of the blue die.
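A quick numeric illustration (the specific events are assumptions chosen for contrast):

```python
from itertools import product

P = {w: 1 / 36 for w in product(range(1, 7), repeat=2)}

def prob(event):
    return sum(P[w] for w in event)

red_is_6 = {w for w in P if w[0] == 6}
blue_is_6 = {w for w in P if w[1] == 6}
sum_is_12 = {w for w in P if w[0] + w[1] == 12}

# The two dice are independent: P(A ∩ B) = P(A) P(B).
assert abs(prob(red_is_6 & blue_is_6)
           - prob(red_is_6) * prob(blue_is_6)) < 1e-12

# But "red is 6" and "the dice sum to 12" are NOT independent:
print(prob(red_is_6 & sum_is_12))        # 1/36
print(prob(red_is_6) * prob(sum_is_12))  # 1/216
```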

  13. Bayesian Inference Instead of inferring statements of the form “Is $\alpha$ true, given the knowledge base?” ($KB \models \alpha$?), we infer statements of the form “What is the likelihood of an event, given the probabilities of other events?”

  14. Inference by Enumeration • Start with the joint probability distribution over all random variables • For any proposition (event) $\phi$, sum the atomic events where $\phi$ holds: $P(\phi) = \sum_{\omega \models \phi} P(\omega)$ (see the sketch below)

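A sketch of inference by enumeration. The slides’ own worked table was not preserved, so this uses the standard toothache/cavity/catch joint distribution from AIMA Chapter 13:

```python
# Full joint distribution P(Cavity, Toothache, Catch) — the standard
# example table from AIMA Chapter 13.
joint = {
    # (cavity, toothache, catch): probability
    (True,  True,  True):  0.108,
    (True,  True,  False): 0.012,
    (True,  False, True):  0.072,
    (True,  False, False): 0.008,
    (False, True,  True):  0.016,
    (False, True,  False): 0.064,
    (False, False, True):  0.144,
    (False, False, False): 0.576,
}

def prob(holds):
    """P(phi) = sum of atomic-event probabilities where phi holds."""
    return sum(p for w, p in joint.items() if holds(w))

print(prob(lambda w: w[1]))          # P(toothache) = 0.2
print(prob(lambda w: w[0] or w[1]))  # P(cavity or toothache) = 0.28
```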

  17. The Power of Independence • Suppose that we have $n$ random variables with domains of size $d$ each. How big is their joint distribution table? $d^n$ entries. • Suppose that we know that the variables are independent; does this change our answer? We need to maintain only $n \cdot d$ values, from which we can compute any entry of the joint distribution (e.g., for $n = 30$ Boolean variables: $2^{30} \approx 10^9$ entries vs. $60$ values). • Independence is good (if you can find it)

  18. Conditional Independence • Suppose that we test for pneumonia ($D$) with two tests • Blood Test ($B$) • Throat Swab ($S$) • Not fully independent: a positive blood test may imply a higher chance of a positive swab. • BUT: independent given knowledge of $D$: $P(B, S \mid D) = P(B \mid D) \, P(S \mid D)$ • “Tests were conducted independently, and are only related by the underlying sickness”

  19. Conditional Independence • Write out the full joint distribution using the chain rule: $P(D, B, S) = P(D) \, P(B \mid D) \, P(S \mid B, D) = P(D) \, P(B \mid D) \, P(S \mid D)$ • Joint distribution of $n$ Boolean RVs: $2^n$ entries; with conditional independence, the number of values needed grows only linearly in $n$! • Conditional independence is more robust and common than absolute independence (see the sketch below)
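A sketch that builds the joint from the factored form and checks both claims; all numeric values (prior, test accuracies) are illustrative assumptions:

```python
from itertools import product

p_d = 0.01                       # P(D = true): assumed prior
p_b = {True: 0.90, False: 0.05}  # P(B = positive | D): assumed
p_s = {True: 0.80, False: 0.10}  # P(S = positive | D): assumed

# Build the full joint via the factored form P(D) P(B|D) P(S|D).
joint = {}
for d, b, s in product([True, False], repeat=3):
    pd = p_d if d else 1 - p_d
    pb = p_b[d] if b else 1 - p_b[d]
    ps = p_s[d] if s else 1 - p_s[d]
    joint[(d, b, s)] = pd * pb * ps

def prob(holds):
    return sum(p for w, p in joint.items() if holds(w))

# B and S are conditionally independent given D...
p_bs_d = prob(lambda w: all(w)) / prob(lambda w: w[0])
p_b_d = prob(lambda w: w[0] and w[1]) / prob(lambda w: w[0])
p_s_d = prob(lambda w: w[0] and w[2]) / prob(lambda w: w[0])
assert abs(p_bs_d - p_b_d * p_s_d) < 1e-12

# ...but NOT absolutely independent:
print(prob(lambda w: w[1] and w[2]))                # P(B, S) ≈ 0.01215
print(prob(lambda w: w[1]) * prob(lambda w: w[2]))  # P(B) P(S) ≈ 0.00626
```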

  20. Bayes’ Rule and Conditional Independence A cause (heavy rain) can have several conditionally independent effects (Alice takes umbrella, Bob takes umbrella, Claire takes umbrella…): $P(\text{Cause}, \text{Effect}_1, \dots, \text{Effect}_n) = P(\text{Cause}) \prod_{i} P(\text{Effect}_i \mid \text{Cause})$ • This is an example of a naive Bayes model

  21. Normalization • We are trying to diagnose the disease $D$, whose value is one of healthy, carrier, or sick; a known fraction of the population falls into each category. • A blood test comes back positive with a probability that depends on the value of $D$. • We run the test three times (independently, given $D$) and obtain two positives (on tests 1 and 2) and one negative (on test 3). What is the likeliest value for $D$?

  22. Normalization $P(D \mid +, +, -) = \frac{P(+, +, - \mid D) \, P(D)}{P(+, +, -)} = \alpha \, P(+ \mid D)^2 \, P(- \mid D) \, P(D)$. We don’t care about the denominator $P(+, +, -)$! Set $\alpha = 1 / P(+, +, -)$, compute the unnormalized scores, and normalize them so the posterior sums to 1.
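A sketch of the normalization trick; the priors and test accuracies below are made-up placeholders, not the slide’s original figures:

```python
# Prior P(D) and P(test positive | D); illustrative assumptions.
prior = {"healthy": 0.90, "carrier": 0.08, "sick": 0.02}
p_pos = {"healthy": 0.01, "carrier": 0.40, "sick": 0.95}

# Evidence: positive, positive, negative (tests independent given D).
unnorm = {d: p_pos[d] ** 2 * (1 - p_pos[d]) * prior[d] for d in prior}

# Normalize: alpha = 1 / P(evidence) = 1 / sum of unnormalized scores.
alpha = 1 / sum(unnorm.values())
posterior = {d: alpha * v for d, v in unnorm.items()}

print(posterior)
print(max(posterior, key=posterior.get))  # likeliest value of D
```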

  23. BAYESIAN NETWORKS AIMA Chapter 14.1 – 14.2

  24. Bayesian Networks • A graphical way of writing joint distributions • Nodes are random variables • Edge from $X$ to $Y$: $X$ directly influences $Y$ • A conditional distribution for each node given its parents: $P(X_i \mid Parents(X_i))$ • In the simplest case, the conditional distribution can be represented as a conditional probability table (CPT): the distribution over $X_i$ for each combination of parent values

  25. Bayesian Networks Given a network over $X_1, \dots, X_n$, write the joint distribution as the product of the local conditional distributions: $P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid Parents(X_i))$
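A minimal sketch of a network as a table of CPTs, reusing the disease/two-tests structure from the earlier slides (structure and numbers are illustrative assumptions); the joint probability of a full assignment is just a product of CPT entries:

```python
# Each node stores its parents and a CPT mapping parent values to
# P(node = True | parents). Structure/numbers are assumptions.
network = {
    "D": {"parents": [], "cpt": {(): 0.01}},
    "B": {"parents": ["D"], "cpt": {(True,): 0.90, (False,): 0.05}},
    "S": {"parents": ["D"], "cpt": {(True,): 0.80, (False,): 0.10}},
}

def joint_prob(assignment):
    """P(x1, ..., xn) = product over i of P(xi | parents(xi))."""
    result = 1.0
    for var, node in network.items():
        parent_vals = tuple(assignment[q] for q in node["parents"])
        p_true = node["cpt"][parent_vals]
        result *= p_true if assignment[var] else 1 - p_true
    return result

print(joint_prob({"D": True, "B": True, "S": False}))  # 0.01*0.90*0.20
```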

  26. Examples Independent causes: “I can be late either because of rain or because I was sick” (edges Rain → Late and Sick → Late)

  27. Examples Conditionally independent effects: “A disease can cause two independent tests to be positive” (edges Disease → Test 1 and Disease → Test 2)

  28. Example With More Variables • I’m at work • Neighbor John calls to say my house alarm is ringing • Neighbor Mary doesn’t call • The alarm is sometimes set off by minor earthquakes • Is there a burglar? • Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls • 5 binary variables: joint distribution table of size $2^5 = 32$ • Exploit domain knowledge to get a smaller representation.

  29. Compactness • A CPT for a Boolean $X_i$ with $k$ Boolean parents has $2^k$ rows for the combinations of parent values • Each row requires one number $p$ for $P(X_i = \text{true} \mid \text{parent values})$; the false case is $1 - p$ • If each variable has no more than $k$ parents, the complete network requires $O(n \cdot 2^k)$ numbers, as compared to $O(2^n)$ for the full joint distribution • For the burglary network, $1 + 1 + 4 + 2 + 2 = 10$ numbers, as compared to $2^5 - 1 = 31$ for the full joint distribution

  30. Inference in Bayesian Networks A Bayesian network represents the full joint distribution, so we can infer any query, e.g., $P(B \mid j, m) = \alpha \sum_{e} \sum_{a} P(B) \, P(e) \, P(a \mid B, e) \, P(j \mid a) \, P(m \mid a)$. We need to sum over the cases for the hidden variables $E$ and $A$.
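A sketch of this computation; the slide’s CPT table was not preserved, so the numbers below are the standard ones from AIMA Figure 14.2:

```python
from itertools import product

# Burglary network CPTs; each entry is P(variable = True | parents).
P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}  # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}  # P(MaryCalls | Alarm)

def f(p, val):
    """P(X = val) given P(X = True) = p."""
    return p if val else 1 - p

def joint(b, e, a, j, m):
    return (f(P_B, b) * f(P_E, e) * f(P_A[(b, e)], a)
            * f(P_J[a], j) * f(P_M[a], m))

# P(Burglary | john calls, mary calls): sum out E and A, normalize.
unnorm = {b: sum(joint(b, e, a, True, True)
                 for e, a in product((True, False), repeat=2))
          for b in (True, False)}
alpha = 1 / sum(unnorm.values())
print(alpha * unnorm[True])  # ≈ 0.284
```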

  31. Constructing Bayesian Networks 1. Choose an ordering of the variables $X_1, \dots, X_n$ 2. For $i = 1, \dots, n$ • Add node $X_i$ to the network • Select a minimal set of parents from $X_1, \dots, X_{i-1}$ such that $P(X_i \mid Parents(X_i)) = P(X_i \mid X_1, \dots, X_{i-1})$ • Link every parent to $X_i$ • Write down the CPT for $P(X_i \mid Parents(X_i))$ (a toy sketch follows)
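A toy sketch of this loop in Python. It uses a small hand-built joint distribution (the disease/two-tests example with assumed numbers) as an oracle for the required conditional-independence checks; a real construction would query a domain expert instead:

```python
from itertools import combinations, product

# Oracle: full joint over the ordered Boolean variables D, B, S,
# built so B and S are cond. independent given D (assumed numbers).
names = ["D", "B", "S"]
joint = {}
for d, b, s in product([True, False], repeat=3):
    pd = 0.01 if d else 0.99
    pb = (0.90 if d else 0.05) if b else (0.10 if d else 0.95)
    ps = (0.80 if d else 0.10) if s else (0.20 if d else 0.90)
    joint[(d, b, s)] = pd * pb * ps

def cond(i, given):
    """P(X_i = True | fixed values of some earlier variables)."""
    den = sum(p for w, p in joint.items()
              if all(w[j] == v for j, v in given.items()))
    num = sum(p for w, p in joint.items()
              if w[i] and all(w[j] == v for j, v in given.items()))
    return num / den

# For each X_i, pick a minimal parent set among X_1..X_{i-1} with
# P(X_i | parents) = P(X_i | X_1..X_{i-1}) for every assignment.
for i in range(len(names)):
    prev = list(range(i))
    parents = prev
    done = False
    for size in range(len(prev) + 1):
        for cand in combinations(prev, size):
            if all(abs(cond(i, dict(zip(prev, vals)))
                       - cond(i, {j: v for j, v in zip(prev, vals)
                                  if j in cand})) < 1e-9
                   for vals in product([True, False], repeat=i)):
                parents, done = list(cand), True
                break
        if done:
            break
    print(names[i], "<-", [names[j] for j in parents])
# Prints: D <- [], B <- ['D'], S <- ['D']
```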

  32. Constructing Bayesian Networks This construction guarantees $P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \dots, X_{i-1})$ (a consequence of the chain rule, generally true!) $= \prod_{i=1}^{n} P(X_i \mid Parents(X_i))$ (by the choice of parents). The network is acyclic (why? each node’s parents appear strictly earlier in the ordering) and has no redundancies.

  33. Variable Order Matters We choose the ordering $M, J, A, B, E$ (originally the order was $B, E, A, J, M$). Is it true that $P(J \mid M) = P(J)$? No: if Mary calls, the alarm has probably gone off, which makes it more likely that John calls too.

  34. Variable Order Matters We choose the ordering $M, J, A, B, E$. Is it true that $P(A \mid J, M) = P(A \mid J)$? Is it true that $P(A \mid J, M) = P(A)$? No to both, so $A$ needs both $M$ and $J$ as parents.

  35. Variable Order Matters We choose the ordering $M, J, A, B, E$. Is it true that $P(B \mid A, J, M) = P(B \mid A)$? Yes. Is it true that $P(B \mid A, J, M) = P(B)$? No, so $B$ needs $A$ as a parent.

  36. Variable Order Matters We choose the ordering $M, J, A, B, E$. Is it true that $P(E \mid B, A, J, M) = P(E \mid A)$? No. Is it true that $P(E \mid B, A, J, M) = P(E \mid A, B)$? Yes, so $E$ needs both $A$ and $B$ as parents.

  37. The Markov Blanket A node is conditionally independent of every other node in the network given the values of its: • parents • children • children’s parents

  38. Putting it All Together We want to compute a posterior such as $P(X \mid e)$ from a Bayesian network: • Bayes’ rule: $P(X \mid e) = \alpha \, P(e \mid X) \, P(X) = \alpha \, P(X, e)$ • Total probability: sum out the hidden variables $Y$: $P(X, e) = \sum_{y} P(X, e, y)$ • Bayesian network factoring: each term $P(X, e, y)$ is a product of CPT entries
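Chaining the three steps together for the burglary query used above (the choice of query is an assumption carried over from slide 30):

$P(B \mid j, m) = \alpha \, P(B, j, m) = \alpha \sum_{e} \sum_{a} P(B, e, a, j, m) = \alpha \sum_{e} \sum_{a} P(B) \, P(e) \, P(a \mid B, e) \, P(j \mid a) \, P(m \mid a)$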
