
Cooperating Intelligent Systems


Presentation Transcript


  1. Cooperating Intelligent Systems Uncertainty & probability Chapter 13, AIMA

  2. ”When an agent knows enough facts about its environment, the logical approach enables it to derive plans that are guaranteed to work.” ”Unfortunately, agents almost never have access to the whole truth about the environment.” From AIMA, chapter 13

  3. Airport example Slide from S. Russell

  4. Why does FOL fail? Laziness: we can’t be bothered to list all possible requirements. Ignorance: we have no complete (or tested) theoretical model for the domain. Practical reasons: even if we knew everything and could be persuaded to list everything, the rules would be totally impractical to use (we can’t possibly test everything).

  5. Assign a probability to a proposition based on the percepts (information). The proposition itself is either true or false; the probability expresses our degree of belief in it being true. Evidence = information that the agent receives. Probabilities can (and will) change when more evidence is acquired. Prior/unconditional probability ⇔ no evidence at all. Posterior/conditional probability ⇔ after evidence is obtained. No plan may be guaranteed to achieve the goal, so to make choices the agent must have preferences between the different possible outcomes of the various plans. Utility represents the value of the outcomes, and utility theory is used to reason with the resulting preferences. An agent is rational if and only if it chooses the action that yields the highest expected utility, averaged over all possible outcomes. Instead of logic alone, use decision theory: Decision theory = probability theory + utility theory. Inspired by Sun-Hwa Hahn
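The rationality principle on this slide can be sketched in a few lines: score each candidate action by its expected utility and pick the maximum. The actions, probabilities, and utilities below are invented for illustration, not taken from the slides.

```python
# Minimal expected-utility sketch (hypothetical numbers, for illustration only).
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# Two made-up plans for getting to the airport:
actions = {
    "leave_early": [(0.95, 100), (0.05, -50)],   # almost surely on time
    "leave_late":  [(0.60, 120), (0.40, -200)],  # riskier, less waiting
}

# A rational agent chooses the action with the highest expected utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
```

Here "leave_early" wins (expected utility 92.5 versus −8), even though "leave_late" has the single best outcome: rationality averages over all outcomes.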

  6. Basic probability notation • X : Random variable • Boolean: P(X=true), P(X=false) • Discrete: P(X=a), P(X=b), ... • Continuous: P(x)dx

  7. Boolean variable X ∈ {True, False} Examples: P(Cavity) P(W31) P(Survive) P(CatchFlight) P(Cancer) P(LectureToday) ...

  8. Discrete variable X ∈ {a1, a2, a3, a4, ...} Examples: P(PlayingCard) P(Weather) P(Color) P(Age) P(LectureQual) ...

  9. Continuous variable X Probability density P(x) Examples: P(Temperature) P(WindSpeed) P(OxygenLevel) P(LengthLecture) ...

  10. Propositions Elementary: Cavity = True, W31 = True, Cancer = False, Card = A♠, Card = 4♥, Weather = Sunny, Age = 40, (20C < Temperature < 21C), (2 hrs < LengthLecture < 3 hrs). Complex (elementary + connective): ¬Cancer ∧ Cavity, Card1 = A♠ ∧ Card2 = A♣, Sunny ∧ (30C < Temperature) ∧ ¬Beer. Every proposition is either true or false; we apply a degree of belief in whether it is true or not. Extension of propositional calculus...

  11. World {Cavity, Toothache} Atomic event: a complete specification of the state of the world. The four atomic events are cavity ∧ toothache, cavity ∧ ¬toothache, ¬cavity ∧ toothache, ¬cavity ∧ ¬toothache. • Mutually exclusive • Exhaustive

  12. Prior, posterior, joint probabilities (note boldface P for whole distributions) • Prior probability P(X=a) = P(a): our belief in X=a being true before information is collected. • Posterior probability P(X=a|Y=b) = P(a|b): our belief that X=a is true when we know that Y=b is true (and this is all we know). • Joint probability P(X=a, Y=b) = P(a,b) = P(a ∧ b): our belief that (X=a ∧ Y=b) is true. Product rule: P(a ∧ b) = P(a,b) = P(a|b)P(b). Image borrowed from Lazaric & Sceffer

  13. Conditional probability (Venn diagram) P(A,B) = P(A ∧ B); P(A|B) = P(A,B)/P(B). [Venn diagram of events A and B not reproduced in the transcript.]

  14. Conditional probability examples • You draw two cards randomly from a deck of 52 playing cards. What is the conditional probability that the second card is an ace if the first card is an ace? • In a course I’m giving with oral examination, the examination statistics over the period 2002-2005 are: 23 passed the oral examination on their first attempt, 25 on their second attempt, 7 on their third attempt, and 8 never passed (after failing the oral exam at least three times). What is the conditional probability (risk) of failing the course if the student fails the first two attempts at the oral exam? • (2005) 84% of Swedish households have a computer at home, and 81% have both a computer and internet. What is the probability that a Swedish household has internet if we know that it has a computer? Work these out...

  15.-18. [Slides 15-18 repeat the three examples while working through them; the transcript adds only these follow-ups.] What’s the probability that both cards are aces? What’s the prior probability of not passing the course? = 8/63 ≈ 13%
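The three questions can be checked with exact arithmetic. The input numbers come straight from the slides; the derivations below are a sketch.

```python
from fractions import Fraction

# 1. Cards: after one ace is drawn, 3 of the remaining 51 cards are aces.
p_second_ace_given_first = Fraction(3, 51)              # = 1/17
p_both_aces = Fraction(4, 52) * Fraction(3, 51)         # = 1/221

# 2. Oral exam: 23 + 25 + 7 + 8 = 63 students in total.
#    15 of them (7 + 8) failed the first two attempts; 8 of those never passed.
p_fail_course = Fraction(8, 63)                         # prior, ~13%
p_fail_given_two_failures = Fraction(8, 7 + 8)          # = 8/15, ~53%

# 3. Households: P(internet | computer) = P(internet, computer) / P(computer).
p_internet_given_computer = Fraction(81, 100) / Fraction(84, 100)  # = 27/28
```

Note how conditioning changes the numbers: the prior risk of failing the course is about 13%, but given two failed attempts it jumps to 8/15 ≈ 53%.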

  19. Inference • Inference means computing P(State of the world | Observed evidence), i.e. P(Y | e). For example: the probability of having a cavity if I have a toothache, or the probability of having a cavity if I have a toothache and the dentist finds a catch in my tooth during inspection.

  20. Inference w/ full joint distribution • The full joint probability distribution is the probability distribution for all random variables used to describe the world. Dentist example {Toothache, Cavity, Catch}

  21.-23. Inference w/ full joint distribution Marginal probability (marginalization): sum the full joint distribution over all the other variables. For the dentist example {Toothache, Cavity, Catch} this gives P(cavity) = 0.2 and P(toothache) = 0.2.

  24. Dentist example {Toothache, Cavity, Catch}

  25. Inference w/ full joint distribution The general inference procedure: P(Y | e) = α Σx P(Y, e, x), where α is a normalization constant. This can always be computed if the full joint probability distribution is available. Cumbersome... erhm... totally impractical: the table (and the summation) is O(2^n) for n Boolean variables.
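Inference by enumeration can be sketched directly. The table below is the standard dentist joint distribution from AIMA chapter 13; the transcript itself only quotes the marginals P(cavity) = P(toothache) = 0.2, which the code reproduces.

```python
# Full joint distribution over (toothache, catch, cavity), from AIMA ch. 13.
joint = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def marginal(pred):
    """Sum the joint entries where pred(toothache, catch, cavity) holds."""
    return sum(p for event, p in joint.items() if pred(*event))

p_cavity = marginal(lambda t, c, cav: cav)       # 0.2
p_toothache = marginal(lambda t, c, cav: t)      # 0.2
# Conditioning = marginalize, then normalize by the probability of the evidence:
p_cavity_given_toothache = marginal(lambda t, c, cav: t and cav) / p_toothache  # 0.6
```

Dividing by P(toothache) plays the role of the normalization constant α in the general procedure.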

  26. Independence Independence between variables can dramatically reduce the amount of computation: we don’t need to mix independent variables in our computations.

  27. Independence for dentist example Oral health is independent of weather Image borrowed from Lazaric & Sceffer

  28. Bayes’ theorem P(A|B) = P(B|A) P(A) / P(B)

  29. Bayes theorem example Joe is a randomly chosen member of a large population in which 3% are heroin users. Joe tests positive for heroin in a drug test that correctly identifies users 95% of the time and correctly identifies nonusers 90% of the time. Is Joe a heroin addict? Example from http://plato.stanford.edu/entries/bayes-theorem/supplement.html

  30. Bayes theorem example [Slide 30 repeats the example from slide 29.]
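Plugging the stated numbers into Bayes' theorem gives the answer to the drug-test example:

```python
# P(user | positive) = P(positive | user) P(user) / P(positive).
p_user = 0.03
p_pos_given_user = 0.95     # test correctly identifies users 95% of the time
p_pos_given_nonuser = 0.10  # test wrongly flags nonusers 10% of the time

p_pos = p_pos_given_user * p_user + p_pos_given_nonuser * (1 - p_user)
p_user_given_pos = p_pos_given_user * p_user / p_pos
# p_user_given_pos ≈ 0.23: even after a positive test, Joe is more likely
# NOT to be a heroin user, because the prior P(user) = 3% is so small.
```

This is the classic base-rate effect: the false positives among the 97% of nonusers outnumber the true positives among the 3% of users.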

  31. Bayes theorem: The Monty Hall Game show In a TV game show, a contestant selects one of three doors; behind one of the doors there is a prize, and behind the other two there are no prizes. After the contestant selects a door, the game-show host opens one of the remaining doors and reveals that there is no prize behind it. The host then asks the contestant whether he/she wants to SWITCH to the other unopened door, or STICK to the original choice. What should the contestant do? © Let’s make a deal (A Joint Venture) See http://www.io.com/~kmellis/monty.html

  32.-34. The Monty Hall Game Show © Let’s make a deal (A Joint Venture) [image slides stepping through the game]

  35. Bayes theorem: The Monty Hall Game show The host is actually asking the contestant whether he/she wants to SWITCH the choice to both other doors, or STICK to the original choice. Phrased this way, it is obvious what the optimal thing to do is: switch.
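The switching argument can also be verified by simulation. This is a minimal Monte Carlo sketch, not part of the original slides:

```python
import random

def play(switch, rng):
    """Play one round of Monty Hall; return True if the contestant wins."""
    prize = rng.randrange(3)
    pick = rng.randrange(3)
    # The host opens a door that is neither the pick nor the prize.
    opened = next(d for d in range(3) if d != pick and d != prize)
    if switch:
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

rng = random.Random(0)  # fixed seed for reproducibility
n = 100_000
win_switch = sum(play(True, rng) for _ in range(n)) / n   # ≈ 2/3
win_stick = sum(play(False, rng) for _ in range(n)) / n   # ≈ 1/3
```

Switching wins exactly when the original pick was wrong, which happens with probability 2/3, and the simulated frequencies agree.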

  36. Conditional independence We say that X and Y are conditionally independent given Z if P(X, Y | Z) = P(X | Z) P(Y | Z) (shown on the slide as a diagram with Z pointing to both X and Y).
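A numeric check of this definition, using the standard AIMA dentist joint table (the numbers are from the book, not the transcript): Toothache and Catch are conditionally independent given Cavity.

```python
# (toothache, catch, cavity) -> probability, standard AIMA dentist table.
p = {
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}
p_cavity = sum(v for (t, c, cav), v in p.items() if cav)             # 0.2
p_tooth_and_cavity = p[(True, True, True)] + p[(True, False, True)]  # 0.12
p_catch_and_cavity = p[(True, True, True)] + p[(False, True, True)]  # 0.18

# Check: P(toothache, catch | cavity) = P(toothache | cavity) P(catch | cavity)
lhs = p[(True, True, True)] / p_cavity                               # 0.54
rhs = (p_tooth_and_cavity / p_cavity) * (p_catch_and_cavity / p_cavity)
```

Both sides come out to 0.54 = 0.6 × 0.9, so the definition holds for this table.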

  37. Naive Bayes: Combining evidence Assume full conditional independence and express the full joint probability distribution as: P(Cause, E1, ..., En) = P(Cause) ∏i P(Ei | Cause)

  38. Naive Bayes: Dentist example

  39. Naive Bayes: Dentist example P(Toothache | Cavity): 2 independent numbers; P(Catch | Cavity): 2 independent numbers; P(Cavity): 1 independent number. The full table has 2^3 − 1 = 7 independent numbers [O(2^n) in general].

  40. Naive Bayes application: Learning to classify text Use a dictionary with words (not too frequent and not too infrequent), e.g. w1 = airplane, w2 = algorithm, ... Estimate the conditional probabilities P(wi | interesting) and P(wi | uninteresting), where wi are the words occurring in the text. Compute P(text | interesting) and P(text | uninteresting) using Naive Bayes (and assuming that word position in the text is unimportant).

  41. Naive Bayes application: Learning to classify text Then compute the probability that the text is interesting (or uninteresting) using Bayes’ theorem. P(text) is just a normalization factor; it is not necessary to compute it, since we are only interested in whether P(interesting | text) > P(uninteresting | text).
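A toy end-to-end version of this classifier. The four training "texts", the uniform prior, and the add-one smoothing are all assumptions for illustration; the slides leave the estimation method unspecified.

```python
import math
from collections import Counter

# Tiny made-up training corpus: (text, label) pairs.
train = [
    ("machine learning algorithm probability", "interesting"),
    ("bayes theorem inference algorithm", "interesting"),
    ("celebrity gossip fashion scandal", "uninteresting"),
    ("gossip scandal weather fashion", "uninteresting"),
]

counts = {"interesting": Counter(), "uninteresting": Counter()}
totals = {"interesting": 0, "uninteresting": 0}
for text, label in train:
    words = text.split()
    counts[label].update(words)
    totals[label] += len(words)
vocab = {w for text, _ in train for w in text.split()}

def log_posterior(text, label):
    # log P(label) + sum_i log P(w_i | label), with add-one smoothing;
    # word order is ignored, exactly as the Naive Bayes assumption requires.
    logp = math.log(0.5)  # uniform prior over the two classes
    for w in text.split():
        logp += math.log((counts[label][w] + 1) / (totals[label] + len(vocab)))
    return logp

def classify(text):
    return max(counts, key=lambda label: log_posterior(text, label))
```

Working in log space and skipping P(text) entirely mirrors the point on the slide: only the comparison between the two posteriors matters.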

  42. Conclusions • Uncertainty arises because of laziness and ignorance: this is unavoidable in the real world. • Probability expresses the agent’s belief in different outcomes. • Probability helps the agent to act in a rational manner: rational = maximize one’s own expected utility. • In order to get efficient algorithms for dealing with probability we need conditional independence among variables.
