
For Monday




Presentation Transcript


  1. For Monday • Read Chapter 13 • Chapter 11, exercise 4

  2. Program 3

  3. Homework

  4. Example Op( Action: Go(there); Precond: At(here); Effects: At(there), ¬At(here) ) Op( Action: Buy(x); Precond: At(store), Sells(store,x); Effects: Have(x) ) • A0: At(Home), Sells(SM,Banana), Sells(SM,Milk), Sells(HWS,Drill) • A∞: Have(Drill), Have(Milk), Have(Banana), At(Home)
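The operators and start/goal states above can be written down directly. A minimal sketch in Python; the dict/set encoding and the field names (`action`, `precond`, `effects`) are a hypothetical representation, not part of the slides.

```python
# Hypothetical encoding of the slide's STRIPS-style operators as plain
# Python dicts; field names and string literals are illustrative only.
go = {
    "action":  "Go(there)",
    "precond": ["At(here)"],
    "effects": ["At(there)", "¬At(here)"],
}
buy = {
    "action":  "Buy(x)",
    "precond": ["At(store)", "Sells(store,x)"],
    "effects": ["Have(x)"],
}

# Initial state A0 and goal state A∞ from the slide.
initial_state = {"At(Home)", "Sells(SM,Banana)", "Sells(SM,Milk)", "Sells(HWS,Drill)"}
goal = {"Have(Drill)", "Have(Milk)", "Have(Banana)", "At(Home)"}
```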

  5. Example Steps • Add three Buy actions to achieve the goals • Use the initial state to achieve the Sells preconditions • Then add Go actions to achieve the newly introduced At preconditions

  6. Handling Threats • The threat to the At(Home) preconditions of both Go(HWS) and Go(SM) cannot be resolved. • Must backtrack: instead of supporting the At(x) precondition of Go(SM) from the initial-state At(Home), support it from the At(HWS) effect of Go(HWS). • Since Go(SM) still threatens the At(HWS) precondition of Buy(Drill), promote Go(SM) to come after Buy(Drill). Demotion is not possible because of the causal link supporting the At(HWS) precondition of Go(SM).

  7. Example Continued • Add Go(Home) action to achieve At(Home) • Use At(SM) to achieve its precondition • Order it after Buy(Milk) and Buy(Banana) to resolve threats to At(SM)
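The ordering constraints accumulated in slides 5–7 can be checked for consistency. A sketch, assuming a plan's orderings are represented as (before, after) pairs; the naive transitive-closure cycle check is illustrative, not an optimized planner.

```python
from itertools import product

def consistent(orderings, steps):
    """True if the ordering constraints are acyclic (naive transitive closure)."""
    closure = set(orderings)
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(list(closure), repeat=2):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True
    return all((s, s) not in closure for s in steps)

steps = {"Go(HWS)", "Go(SM)", "Buy(Drill)", "Buy(Milk)", "Buy(Banana)", "Go(Home)"}
orderings = {
    ("Go(HWS)", "Buy(Drill)"),
    ("Go(HWS)", "Go(SM)"),
    ("Buy(Drill)", "Go(SM)"),    # promotion: Go(SM) after Buy(Drill)
    ("Go(SM)", "Buy(Milk)"),
    ("Go(SM)", "Buy(Banana)"),
    ("Buy(Milk)", "Go(Home)"),   # Go(Home) ordered after both Buy steps
    ("Buy(Banana)", "Go(Home)"),
}
print(consistent(orderings, steps))                               # True
print(consistent(orderings | {("Go(Home)", "Go(HWS)")}, steps))   # False: cycle
```

Adding the extra pair in the last line closes a cycle through Go(HWS), Buy(Drill), Go(SM), Buy(Milk), and Go(Home), which is exactly the situation promotion and demotion must avoid.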

  8. Uncertainty • Everyday reasoning and decision making are based on uncertain evidence and inferences. • Classical logic only allows conclusions that are strictly true or strictly false. • We need to account for this uncertainty and for the need to weigh and combine conflicting evidence.

  9. Coping with Uncertainty • Straightforward application of probability theory is impractical, since the large number of conditional probabilities required is rarely, if ever, available. • Therefore, early expert systems employed fairly ad hoc methods for reasoning under uncertainty and combining evidence. • More recently, methods rigorously founded in probability theory that reduce the number of conditional probabilities required have flourished.

  10. Probability • Probabilities are real numbers in the range 0–1 representing the a priori likelihood that a proposition is true. P(Cold) = 0.1 P(¬Cold) = 0.9 • Probabilities can also be assigned to all values of a random variable (continuous or discrete) with a specific range of values (domain), e.g. low, normal, high. P(temperature=normal) = 0.99 P(temperature=98.6) = 0.99

  11. Probability Vectors • The vector form gives probabilities for all values of a discrete variable, i.e. its probability distribution. P(temperature) = <0.002, 0.99, 0.008> • These are prior probabilities, which assume no other information is known.
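The distribution vector above can be stored as a mapping from values to probabilities. A sketch, assuming the vector's entries correspond in order to the values low, normal, high from slide 10.

```python
# Prior distribution over temperature from the slide, assuming the
# vector order <0.002, 0.99, 0.008> corresponds to low, normal, high.
p_temperature = {"low": 0.002, "normal": 0.99, "high": 0.008}

# A probability distribution must sum to 1 over the variable's domain.
assert abs(sum(p_temperature.values()) - 1.0) < 1e-9

# With no other information, "normal" is the most probable value.
print(max(p_temperature, key=p_temperature.get))  # normal
```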

  12. Conditional Probability • Conditional probability specifies the probability given that the values of some other random variables are known. P(Sneeze | Cold) = 0.8 P(Cold | Sneeze) = 0.6 • The probability of a sneeze given a cold is 80%. • The probability of a cold given a sneeze is 60%.

  13. Cond. Probability cont. • Assumes that the given information is all that is known, so all known information must be given. P(Sneeze | Cold ∧ Allergy) = 0.95 • Also allows for conditional distributions: P(X | Y) gives a 2-D array of values for all P(X=xi | Y=yj) • Defined as P(A | B) = P(A ∧ B) / P(B)
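The definition can be checked numerically. A sketch using P(Cold) = 0.1 from slide 10; the joint value P(Sneeze ∧ Cold) = 0.08 is an assumed number, chosen so the ratio reproduces the slide's P(Sneeze | Cold) = 0.8.

```python
# P(A | B) = P(A ∧ B) / P(B), checked on the cold/sneeze example.
p_cold = 0.1              # prior, from slide 10
p_sneeze_and_cold = 0.08  # assumed joint probability (not in the slides)

p_sneeze_given_cold = p_sneeze_and_cold / p_cold  # ≈ 0.8, matching slide 12
assert abs(p_sneeze_given_cold - 0.8) < 1e-9
```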

  14. Axioms of Probability Theory • All probabilities are between 0 and 1. 0 ≤ P(A) ≤ 1 • Necessarily true propositions have probability 1; necessarily false ones have probability 0. P(true) = 1 P(false) = 0 • The probability of a disjunction is given by P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
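The disjunction axiom can be exercised with made-up illustrative numbers (the three input probabilities below are not from the slides):

```python
# Inclusion-exclusion: P(A ∨ B) = P(A) + P(B) − P(A ∧ B).
# The three input probabilities are illustrative values only.
p_a, p_b, p_a_and_b = 0.4, 0.5, 0.2
p_a_or_b = p_a + p_b - p_a_and_b  # ≈ 0.7

# The result must itself satisfy the first axiom: 0 ≤ P(A ∨ B) ≤ 1.
assert 0.0 <= p_a_or_b <= 1.0
```

Subtracting P(A ∧ B) avoids double-counting the overlap: outcomes in both A and B are otherwise included once in P(A) and again in P(B).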
