
FT228/4 Knowledge Based Decision Support Systems


Presentation Transcript


  1. FT228/4 Knowledge Based Decision Support Systems Uncertainty Management in Rule-Based Systems Bayesian Reasoning Ref: Artificial Intelligence: A Guide to Intelligent Systems, Michael Negnevitsky – Aungier St. Call No. 006.3

  2. Uncertainty ? • Approximate Reasoning • Inexact Reasoning • Information available to human expert • Incomplete • Inconsistent • Uncertain • All of the above • Information of this nature is unsuitable for solving problems by exact reasoning

  3. Sources of uncertainty in expert systems • Weak Implications • Often vague associations between IF and THEN parts of rules • Need to be able to include certainty factors to indicate a degree of correlation • Imprecise Language • Natural language inherently ambiguous and imprecise e.g. often, sometimes, never • Difficult to express as IF THEN rules • Quantifying meaning of terms enables expert systems to establish appropriate matching of antecedents to facts in database

  4. Sources of uncertainty in expert systems • Unknown Data • In case of incomplete or missing data must accept the value ‘unknown’ and proceed to approximate reasoning with this value • Combining Views of different experts • Experts seldom reach same conclusions • Have contradictory opinions and produce conflicting rules • Weight must be attached to each expert and this factored into conclusion • No systematic method to obtain weights

  5. Statistics • Using probability theory we can determine the chances of an event occurring • Probability • Likelihood of being realized • Proportion of cases in which event occurs • Situations when probability is appropriate • Genuinely random world e.g. cards • Normal world, impossible to measure all causes and effects • Exceptions to normal relationships • Basis for learning

  6. Probability Theory • Can be expressed mathematically as a numerical index with a range • Zero (absolute impossibility) • Unity (absolute certainty) • Most events have a probability index strictly between 0 and 1 • Each event has at least 2 possible outcomes • Success or failure

  7. Probability Theory
  p(success) = (the number of successes) / (the number of possible outcomes)
  p(failure) = (the number of failures) / (the number of possible outcomes)
  p(s) = p = s / (s + f)      p(f) = q = f / (s + f)      p + q = 1
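  As a quick sanity check of these definitions, here is a minimal Python sketch (the die-roll counts are an invented illustration, not from the slides):

```python
# Rolling a fair die and calling a six a "success": s = 1 way, f = 5 ways.
s, f = 1, 5
p = s / (s + f)  # p(success)
q = f / (s + f)  # p(failure)
print(p, q, p + q)  # 0.1666..., 0.8333..., and p + q = 1.0 as required
```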

  8. Probability Theory – dependent events • Let A & B be events • A & B occur conditionally on the occurrence of the other • Probability that A will occur if B occurs is called conditional probability • p(A|B) where | denotes GIVEN • Reads as probability that A will occur given that B has occurred
  p(A|B) = (the number of times A and B can occur) / (the number of times B can occur)
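  A small sketch of this counting definition of conditional probability, using invented tallies:

```python
# Hypothetical tallies over a run of observations (invented for illustration):
n_B = 40         # number of times B occurred
n_A_and_B = 10   # number of times A and B occurred together
p_A_given_B = n_A_and_B / n_B  # p(A|B) = 10 / 40 = 0.25
print(p_A_given_B)
```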

  9. Probability Theory – dependent events • Probability that A and B can occur is called joint probability, p(A ∩ B)
  p(A ∩ B) = p(A|B) × p(B)      p(B ∩ A) = p(B|A) × p(A)
  Since p(A ∩ B) = p(B ∩ A), equating the two gives
  p(A|B) = p(B|A) × p(A) / p(B)
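  The rearrangement above can be checked numerically; this sketch uses invented probabilities:

```python
# Invented example values:
p_A, p_B = 0.3, 0.4
p_B_given_A = 0.5
p_A_and_B = p_B_given_A * p_A   # p(A ∩ B) = p(B|A) x p(A) = 0.15
p_A_given_B = p_A_and_B / p_B   # joint probability rearranged: 0.375
# Identical to the inverted form p(B|A) x p(A) / p(B):
assert abs(p_A_given_B - p_B_given_A * p_A / p_B) < 1e-12
```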

  10. Probability Theory – dependent events • Bayesian rule
  p(A|B) = p(B|A) × p(A) / p(B)
  where, for mutually exclusive events B1 … Bn that cover all possibilities,
  p(A) = Σ (i = 1..n) p(A ∩ Bi) = Σ (i = 1..n) p(A|Bi) × p(Bi)
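  A sketch of the Bayesian rule with the total-probability expansion of the denominator (the three events and their values are invented):

```python
# Mutually exclusive, exhaustive events B1..B3 (values invented):
p_B = [0.2, 0.3, 0.5]           # priors p(Bi)
p_A_given_B = [0.9, 0.6, 0.1]   # likelihoods p(A|Bi)
# Total probability: p(A) = sum of p(A|Bi) x p(Bi)
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
# Posterior for B1 via the Bayesian rule:
p_B1_given_A = p_A_given_B[0] * p_B[0] / p_A
print(p_A, p_B1_given_A)
```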

  11. Bayesian Reasoning • Assuming a random sampling of events, Bayesian theory supports the calculation of more complex probabilities from previously known results • E.g. in a card game of four people where all cards are equally distributed, if I do not have the Queen of Hearts, each other person has a 1/3 probability of having it, and also a 1/9 probability of having both the Queen and the Ace of Hearts, assuming that holding each card is an independent event • probability(A & B) = probability(A) × probability(B), given that A & B are independent
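  The card arithmetic can be reproduced exactly with fractions, keeping the slide's independence assumption (which is only an approximation for real hands):

```python
from fractions import Fraction

p_queen = Fraction(1, 3)  # Queen of Hearts is in one of the other 3 hands
p_ace = Fraction(1, 3)    # likewise for the Ace of Hearts
# Multiplying, per the slide's independence assumption:
p_both = p_queen * p_ace
print(p_queen, p_both)    # 1/3 and 1/9
```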

  12. Bayesian Reasoning • Prior probability • or unconditioned probability, of an event is the probability assigned to an event in the absence of knowledge supporting its occurrence or absence, i.e. the probability of an event prior to any evidence: p(event) • Posterior probability • or after-the-fact probability, or conditional probability, of an event is the probability of an event given some evidence: p(event|evidence) • The prior probability of a person having a disease is the number of people with the disease divided by the total number of people in the domain.

  13. Bayesian Reasoning • The posterior probability of a person having a disease d with symptom s is: p(d|s) = |d ∧ s| / |s| • where |·| indicates the number of elements in the set, i.e. the number of people having both disease d and symptom s divided by the total number of people with symptom s • Bayes calculates p(d|s) as p(d|s) = p(d) × p(s|d) / p(s)
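  Both routes to p(d|s) give the same number, which a short sketch with invented patient counts can confirm:

```python
# Hypothetical patient counts (invented for illustration):
n_total = 10000   # all people in the domain
n_d = 50          # people with disease d
n_s = 200         # people with symptom s
n_d_and_s = 30    # people with both d and s
p_d_given_s = n_d_and_s / n_s   # |d ∧ s| / |s| = 0.15
# The same value via Bayes: p(d|s) = p(d) x p(s|d) / p(s)
p_d, p_s = n_d / n_total, n_s / n_total
p_s_given_d = n_d_and_s / n_d
assert abs(p_d_given_s - p_d * p_s_given_d / p_s) < 1e-12
```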

  14. Bayesian Reasoning • The numbers on the right-hand side of the equation are easy to come by • e.g. it is much easier to determine the number of meningitis patients who suffer headaches than it is to discover the number of headache sufferers who suffer from meningitis. • Also, not many numbers are needed. Trouble begins when you consider multiple diseases and multiple symptoms: p(d|s1 & s2 &…& sn) = p(d) p(s1 & s2 &…& sn | d) / p(s1 & s2 &…& sn) • Large number of probabilities required

  15. Bayesian Reasoning • In many diagnostic situations we must also deal with negative information e.g. when a patient does not have a given symptom, we require p(not S) = 1 – p(S) and p(not d|s) = 1 – p(d|s)

  16. Bayesian Reasoning • Suppose all rules in a knowledge base are expressed as follows • IF E IS TRUE THEN H IS TRUE {WITH PROBABILITY p} • What if event E has occurred but we don’t know whether H has occurred? Can we compute a probability that event H has occurred as well?

  17. Bayesian Reasoning • Instead of using events A and B use Hypothesis H and Evidence E
  P(H|E) = P(E|H) × P(H) / [P(E|H) × P(H) + P(E|H′) × P(H′)]
  • Where: • P(H|E) is the probability that H is true given E • P(H) is the probability that H is true overall • P(E|H) is the probability of observing E when H is true • P(H′) is the probability of H being false • P(E|H′) is the probability of observing E even when H is false

  18. Bayes Theorem
  p(H|E) = p(E|H) × p(H) / [p(E|H) × p(H) + p(E|H′) × p(H′)]
  • p(H) is the prior probability of hypothesis H being true • p(E|H) is the probability that hypothesis H being true will result in evidence E • p(H′) is the prior probability of hypothesis H being false • p(E|H′) is the probability of finding evidence E even when hypothesis H is false
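  The theorem drops straight into a small helper function; this is a minimal sketch, with names chosen here rather than taken from the slides:

```python
def bayes(p_H, p_E_given_H, p_E_given_not_H):
    """Return p(H|E) from the prior p(H) and the two likelihoods."""
    p_not_H = 1 - p_H
    p_E = p_E_given_H * p_H + p_E_given_not_H * p_not_H  # the denominator
    return p_E_given_H * p_H / p_E
```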

  19. Example I • Assume the following probabilities for product failure given the level of contamination in manufacturing:
  Level of Contamination    p(failure)
  High                      0.1
  Medium                    0.01
  Low                       0.001
  • In a particular run, 20% of the chips are subjected to high levels of contamination, 30% to medium and 50% to low levels of contamination • If a semiconductor chip in the product fails, what is the probability that the chip was exposed to high levels of contamination?

  20. Example I
  p(H|F) = p(F|H) p(H) / p(F) = (0.1)(0.2) / p(F)
  p(F) = p(F|H) p(H) + p(F|M) p(M) + p(F|L) p(L) = (0.1)(0.2) + (0.01)(0.3) + (0.001)(0.5) = 0.0235
  p(H|F) = (0.1)(0.2) / 0.0235 ≈ 0.85
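  The same calculation in Python, using the figures from the slide:

```python
priors = {"high": 0.2, "medium": 0.3, "low": 0.5}      # p(level)
p_fail = {"high": 0.1, "medium": 0.01, "low": 0.001}   # p(F|level)
p_F = sum(p_fail[k] * priors[k] for k in priors)        # total probability: 0.0235
p_high_given_F = p_fail["high"] * priors["high"] / p_F
print(round(p_high_given_F, 2))  # 0.85
```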

  21. Example II • A medical procedure has been shown to be highly effective in early detection of an illness. • The probability that the test correctly identifies someone with the illness as positive is 0.99. • The probability that the test correctly identifies someone without the illness as negative is 0.95. • The incidence of the illness in the population is 0.0001. • You take the test and the result is positive. • What is the probability you have the illness?

  22. Example II • Let D denote the event you have the illness • Let S denote the event that the test signals positive. • You have to establish p(D|S). • The probability that the test correctly signals someone without the illness as negative is 0.95, therefore the probability of a positive test without the illness is 0.05 • p(S|D′) = 0.05 • From Bayes… p(D|S) = p(S|D)p(D) / [p(S|D)p(D) + p(S|D′)p(D′)] = (.99)(.0001) / [(.99)(.0001) + (.05)(1 – .0001)] = 1/506 ≈ 0.002
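  The arithmetic, reusing the bayes() helper sketched under slide 18:

```python
p_D_given_S = bayes(p_H=0.0001,            # prior incidence p(D)
                    p_E_given_H=0.99,      # p(S|D), test sensitivity
                    p_E_given_not_H=0.05)  # p(S|D'), false-positive rate
print(p_D_given_S)  # ~0.00198, i.e. 1/506 -- despite a 99% accurate test
```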

  23. Multiple Hypotheses • What if an expert, based on single evidence E, cannot choose a single hypothesis from among H1 – Hn? • Or, given multiple evidences E1 – En, the expert can also produce multiple hypotheses?

  24. Bayes Theorem • Bayes provides a way of computing the probability of a hypothesis Hi, following from a particular piece of evidence, given only the probabilities with which the evidence follows from actual causes (hypotheses)
  P(Hi|E) = P(E|Hi) × P(Hi) / Σ (k = 1..n) P(E|Hk) × P(Hk)
  • Where: • P(Hi|E) is the probability that Hi is true given E • P(Hi) is the probability that Hi is true overall • P(E|Hi) is the probability of observing E when Hi is true • n is the number of hypotheses
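  As a sketch, the single-evidence, multiple-hypothesis form normalises the joint terms over all hypotheses (the function name and structure here are my own, not from the slides):

```python
def posteriors(priors, likelihoods):
    """p(Hi|E) for every Hi: priors[i] = p(Hi), likelihoods[i] = p(E|Hi)."""
    joint = [l * p for l, p in zip(likelihoods, priors)]
    total = sum(joint)                 # the summed denominator over k = 1..n
    return [j / total for j in joint]
```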

  25. Multiple Hypotheses
  P(Hi|E1 E2 … En) = P(E1|Hi) × P(E2|Hi) × … × P(En|Hi) × P(Hi) / Σ (k = 1..n) P(E1|Hk) × P(E2|Hk) × … × P(En|Hk) × P(Hk)
  • Suppress subtleties of evidence and assume conditional independence among the different evidences
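  Under that conditional-independence assumption the per-evidence likelihoods simply multiply; a minimal sketch, again with names of my own choosing:

```python
from math import prod

def posteriors_multi(priors, likelihoods):
    """p(Hi|E1..En) assuming the evidences are conditionally independent.
    likelihoods[i] is the list [p(E1|Hi), ..., p(En|Hi)]."""
    joint = [prod(ls) * p for ls, p in zip(likelihoods, priors)]
    total = sum(joint)
    return [j / total for j in joint]
```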

  26. Multiple Hypotheses • How does an expert system compute and rank all potentially true hypotheses? • Given the prior probabilities • Determine the conditional probabilities • Calculate the posterior probabilities • Rank the posteriors
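  Those four steps, run on invented numbers through the posteriors_multi() sketch above:

```python
priors = [0.40, 0.35, 0.25]                         # p(Hi), invented
likelihoods = [[0.3, 0.9], [0.8, 0.0], [0.5, 0.7]]  # p(Ej|Hi), invented
post = posteriors_multi(priors, likelihoods)
# Rank hypotheses by posterior probability, highest first:
for i, p in sorted(enumerate(post, 1), key=lambda x: -x[1]):
    print(f"H{i}: {p:.3f}")  # H1: 0.552, H3: 0.448, H2: 0.000
```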

  27. Bayesian Reasoning • Two major requirements for Bayes’ theorem • All the probabilities on the relationships of evidence with the various hypotheses must be known, as well as the probabilistic relationships among the pieces of evidence • All relationships between hypotheses and evidence, P(E|Hk), must be independent. This assumption of independence must be justified, which is difficult. • Most expert systems rely on heuristics to augment Bayes’ Theorem.

  28. Bayesian method • Requires probability values as primary inputs • Values usually involve human judgement • Humans cannot elicit probabilities consistent with Bayesian rules • Or are just really bad at it • Domain experts do not deal easily with conditional probabilities • Often deny existence of hidden implicit probabilities

  29. Bayesian Reasoning [Diagram: symptoms s1, s2, s3, s4, s5, …, sn each linked directly to cause1, cause2, …, causem] Non-expert view of symptoms and causes

  30. Bayesian Reasoning [Diagram: symptoms s1, s2, s3, s4, s5, …, sn grouped into intermediate states I1, I2, …, which link to cause1, cause2, …, causem] Expert view: related symptoms are grouped together into intermediate, pathological states, which makes inference more manageable
