
Knowledge Representation and Reasoning




  1. Knowledge Representation and Reasoning
  Logic and Reasoning in Uncertain Conditions
  Harris Georgiou – xgeorgio@di.uoa.gr
  Lecture 5

  2. Reasoning in Uncertainty
  • Probabilities and Uncertainty
  • The Bayes rule
  • Measures of Belief and Disbelief
  • Certainty Factors (CF)
  • Hypotheses and Evidence
  • Logical operators and CF
  • Inference rules and CF

  3. Probability Theory
  • In the real world, events happen with a certain probability, usually as the result of many factors
  • Some of these factors are independent of each other, i.e. they occur independently
  • Others usually occur (or fail to occur) together
  • Example: the event “rain” happens with a probability that depends on the correlated factors “cloud” and “cold”

  4. The Bayes Theorem
  • To calculate conditional probabilities we use the Bayes rule:
  P(H|E) = P(E|H)·P(H) / P(E) , where P(E) = P(E|H)·P(H) + P(E|¬H)·P(¬H)
  H = hypothesis (e.g. H = “rain”)
  E = evidence (e.g. E = “cloud”)
  P(H|E) = probability of “rain” given “cloud”
  P(E|H) = probability of “cloud” given “rain” (from past history)
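The Bayes rule above can be sketched directly in code. This is a minimal illustration, not part of the lecture; the numeric values (prior and likelihoods for the rain/cloud example) are assumptions chosen purely for demonstration.

```python
def bayes(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1.0 - prior_h)
    return p_e_given_h * prior_h / p_e

# Hypothetical numbers: P(rain)=0.3, P(cloud|rain)=0.9, P(cloud|no rain)=0.4
posterior = bayes(0.3, 0.9, 0.4)
print(round(posterior, 3))  # 0.491
```

Note that the denominator is just the total probability of the evidence, which normalizes the posterior over the competing hypotheses H and ¬H.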

  5. Certainty Factors
  • The Bayes rule becomes too complex when there are many dependent pieces of evidence Ei
  • Instead, we often use approximate models for describing levels of certainty
  • Certainty Factors (Shortliffe, 1976): first used in the expert system MYCIN
  • They describe hypotheses and evidence in terms of “certainty” about their truth

  6. Measures of Belief and Disbelief
  Measure of Belief (MB):
  MB(H|E) = 1 , if P(H)=1
          = (max{P(H|E),P(H)} - P(H)) / (1 - P(H)) , if P(H)<1
  Measure of Disbelief (MD):
  MD(H|E) = 1 , if P(H)=0
          = (P(H) - min{P(H|E),P(H)}) / P(H) , if P(H)>0

  7. Certainty Factors
  Certainty Factor (CF):
  CF(H|E) = MB(H|E) - MD(H|E)
  0 <= MB(H|E) <= 1 , 0 <= MD(H|E) <= 1
  thus: -1 <= CF(H|E) <= +1
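The MB, MD and CF definitions from the two slides above translate directly into code. This is a small sketch with hypothetical probability values (a prior of 0.4 and a posterior of 0.7 are assumptions for illustration, not values from the lecture).

```python
def mb(p_h, p_h_e):
    """Measure of Belief: how much evidence E increases belief in H."""
    if p_h == 1.0:
        return 1.0
    return (max(p_h_e, p_h) - p_h) / (1.0 - p_h)

def md(p_h, p_h_e):
    """Measure of Disbelief: how much evidence E decreases belief in H."""
    if p_h == 0.0:
        return 1.0
    return (p_h - min(p_h_e, p_h)) / p_h

def cf(p_h, p_h_e):
    """Certainty Factor in [-1, +1]: net effect of E on belief in H."""
    return mb(p_h, p_h_e) - md(p_h, p_h_e)

# Assumed values: prior P(H)=0.4, posterior P(H|E)=0.7 (evidence supports H)
print(round(cf(0.4, 0.7), 3))  # 0.5
```

Since the posterior exceeds the prior, MD is 0 and CF equals MB; evidence that lowered the posterior would instead give MB = 0 and a negative CF.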

  8. CF: Multiple pieces of evidence
  What if new evidence E2 becomes available?
  MB(H|E1,E2) = MB(H|E1) + MB(H|E2)·[1 - MB(H|E1)]
  MD(H|E1,E2) = MD(H|E1) + MD(H|E2)·[1 - MD(H|E1)]
  CF(H|E1,E2) = MB(H|E1,E2) - MD(H|E1,E2)
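The incremental combination formula above has the same shape for MB and MD, so one helper covers both. A minimal sketch; the MB/MD values fed in are assumptions for illustration only.

```python
def combine(m1, m2):
    """Incremental combination: m(H|E1,E2) = m1 + m2*(1 - m1).
    Works for both MB and MD; order of E1, E2 does not matter."""
    return m1 + m2 * (1.0 - m1)

# Hypothetical values: MB(H|E1)=0.6, MB(H|E2)=0.5, MD(H|E1)=0.1, MD(H|E2)=0.2
mb12 = combine(0.6, 0.5)        # combined belief
md12 = combine(0.1, 0.2)        # combined disbelief
cf12 = mb12 - md12
print(round(cf12, 2))  # 0.52
```

Note the formula is commutative and never exceeds 1, so adding more supporting evidence pushes MB asymptotically toward 1 without overshooting.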

  9. CF: Uncertain evidence
  What if evidence E is not 100% certain to be true?
  MB(H|E) = normal MB value for H based on E
  • If E comes from “measurements” M, then:
  CF(E|M) = certainty of E based on M
  • The updated value for MB(H|E) then becomes:
  MB(H|E)new = MB(H|E) · max{0, CF(E|M)}
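The discounting rule above is a one-liner in code. The numeric inputs below (MB of 0.8, evidence certainties of 0.5 and -0.3) are hypothetical values for illustration.

```python
def mb_updated(mb_h_e, cf_e_m):
    """Discount MB(H|E) by the certainty of the evidence itself:
    MB(H|E)new = MB(H|E) * max(0, CF(E|M))."""
    return mb_h_e * max(0.0, cf_e_m)

print(mb_updated(0.8, 0.5))   # 0.4
print(mb_updated(0.8, -0.3))  # 0.0 (disbelieved evidence gives no support)
```

The max{0, ·} clamp means evidence we actively disbelieve (negative CF) contributes nothing, rather than subtracting belief.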

  10. CF: Composite hypotheses
  What if composite hypotheses are to be evaluated?
  MB(H1 AND H2 | E) = min{ MB(H1|E), MB(H2|E) } (similarly for MD)
  MB(H1 OR H2 | E) = max{ MB(H1|E), MB(H2|E) } (similarly for MD)
  MB(NOT H | E) = 1 - MB(H|E) (similarly for MD)
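The min/max/complement operators above are trivial to express in code; this sketch applies them to hypothetical MB values (0.7 and 0.4, chosen for illustration).

```python
def mb_and(m1, m2):
    """Conjunction: belief in (H1 AND H2) is limited by the weaker part."""
    return min(m1, m2)

def mb_or(m1, m2):
    """Disjunction: belief in (H1 OR H2) follows the stronger part."""
    return max(m1, m2)

def mb_not(m):
    """Negation, as defined in the lecture: complement of the belief."""
    return 1.0 - m

# Hypothetical values: MB(H1|E)=0.7, MB(H2|E)=0.4
print(mb_and(0.7, 0.4))        # 0.4
print(mb_or(0.7, 0.4))         # 0.7
print(round(mb_not(0.7), 2))   # 0.3
```

These are the same min/max combinations used for conjunction and disjunction in fuzzy logic, applied here to belief measures.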

  11. CF: IF-THEN inference rules
  How do we calculate CF for logical IF-THEN rules?
  rule(R): IF “E” THEN “H” with certainty CF(R)
  • Case CF(H)>0, CF(R)>0:
  CF(H|E) = CF(H) + CF(R)·(1 - CF(H))
  • Case CF(H)<0, CF(R)<0:
  CF(H|E) = CF(H) + CF(R)·(1 + CF(H))
  • Case CF(H)·CF(R)<0 (i.e. different signs):
  CF(H|E) = (CF(H) + CF(R)) / (1 - min{|CF(H)|, |CF(R)|})
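The three-case combination scheme can be collapsed into one function, following the standard MYCIN-style formulas: for two positive factors use 1 - CF1 in the second term, for two negative factors 1 + CF1, and for mixed signs divide by 1 - min of the magnitudes. The input values below are assumptions for illustration.

```python
def combine_cf(cf1, cf2):
    """Combine two certainty factors supporting the same hypothesis."""
    if cf1 >= 0.0 and cf2 >= 0.0:
        return cf1 + cf2 * (1.0 - cf1)          # both confirm
    if cf1 < 0.0 and cf2 < 0.0:
        return cf1 + cf2 * (1.0 + cf1)          # both disconfirm
    return (cf1 + cf2) / (1.0 - min(abs(cf1), abs(cf2)))  # conflicting

# Hypothetical values for the three cases
print(round(combine_cf(0.6, 0.4), 2))    # 0.76
print(round(combine_cf(-0.6, -0.4), 2))  # -0.76
print(round(combine_cf(0.6, -0.4), 2))   # 0.33
```

The function is symmetric in its arguments and always returns a value in [-1, +1], so a chain of rules can be folded through it in any order.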

  12. P.C. – Readings
  • S. J. Russell, P. Norvig, “Artificial Intelligence: A Modern Approach”, 2nd Ed., Prentice Hall, 2002. [see: ch. 19]
