

  1. IN THE NAME OF ALLAH
  DECISION MAKING BY USING THE THEORY OF EVIDENCE
  STUDENTS: HOSSEIN SHIRZADEH, AHAD OLLAH EZZATI
  SUPERVISOR: Prof. BAGERI SHOURAKI
  SPRING 2009

  2. OUTLINES • INTRODUCTION • BELIEF • FRAMES OF DISCERNMENT • COMBINING THE EVIDENCE • ADVANTAGES OF DS THEORY • DISADVANTAGES OF DS THEORY • BASIC PROBABILITY ASSIGNMENT • BELIEF FUNCTIONS • DEMPSTER'S RULE OF COMBINATION • ZADEH'S OBJECTION TO DS THEORY • GENERALIZED DS THEORY • AN APPLICATION OF DECISION MAKING METHOD

  3. INTRODUCTION • Introduced by Glenn Shafer in 1976 • "A Mathematical Theory of Evidence" • A new approach to the representation of uncertainty • What does uncertainty mean? Most people don't like uncertainty • Applications • Expert systems • Decision making • Image processing, project planning, risk analysis, …

  4. INTRODUCTION • All students of partial belief have tied it to the Bayesian theory and either • committed themselves to the value of the idea and defended it, or • rejected the theory (arguing it is inviable)

  5. INTRODUCTION: BELIEF FUNCTION • Θ: a finite set • Set of all subsets of Θ: 2^Θ • A function Bel: 2^Θ → [0,1] with Bel(∅) = 0, Bel(Θ) = 1, and for every collection A1, …, An of subsets of Θ: Bel(A1 ∪ … ∪ An) ≥ Σ_{∅ ≠ I ⊆ {1,…,n}} (-1)^(|I|+1) Bel(∩_{i ∈ I} Ai) • Then Bel is called a belief function on Θ

  6. INTRODUCTION: BELIEF FUNCTION • Bel is called a simple support function if • there exists a non-empty subset A of Θ and a number 0 ≤ s ≤ 1 such that • Bel(B) = 0 if B does not contain A, Bel(B) = s if B contains A and B ≠ Θ, and Bel(Θ) = 1

  7. INTRODUCTION: THE IDEA OF CHANCE • For several centuries the idea of a numerical degree of belief has been identified with the idea of chance • Evidence theory is intelligible only if we reject this unification • Chance: • A random experiment: unknown outcome • The proportion of the time that a particular one of the possible outcomes tends to occur

  8. INTRODUCTION: THE IDEA OF CHANCE • Chance density • Set of all possible outcomes: X • A chance q(x) specified for each possible outcome x • A chance density must satisfy: q(x) ≥ 0 for all x ∈ X, and Σ_{x ∈ X} q(x) = 1

  9. INTRODUCTION: THE IDEA OF CHANCE • Chance function • Ch(A): the proportion of time that the actual outcome tends to be in a particular subset A of X, i.e. Ch(A) = Σ_{x ∈ A} q(x) • Ch is a chance function if and only if it obeys the following: Ch(∅) = 0, Ch(X) = 1, and Ch(A ∪ B) = Ch(A) + Ch(B) whenever A ∩ B = ∅
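
A minimal Python sketch of these two notions; the outcome set and density values below are made up for illustration:

```python
# A chance density q over a finite outcome set X, and the chance function
# it induces: Ch(A) = sum of q(x) for x in A.
X = {"heads", "tails"}
q = {"heads": 0.5, "tails": 0.5}                  # hypothetical density

assert all(v >= 0 for v in q.values())            # q(x) >= 0
assert abs(sum(q.values()) - 1.0) < 1e-9          # the q(x) sum to 1

def ch(A):
    """Proportion of time the actual outcome tends to fall in the subset A."""
    return sum(q[x] for x in A)

print(ch(set()), ch(X), ch({"heads"}))   # 0 1.0 0.5 -> Ch(empty)=0, Ch(X)=1
```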

  10. INTRODUCTION: CHANCES AS DEGREES OF BELIEF • If we knew the chances, then we would surely adopt them as our degrees of belief • But we usually do not know the chances • We often have little idea about which chance density governs a random experiment • A scientist is interested in a random experiment precisely because it might be governed by any one of several chance densities

  11. INTRODUCTION: CHANCES AS DEGREES OF BELIEF • Two views of chances: • Features of the world • This is the way Shafer treats chance • Features of our knowledge or belief • The view of Pierre-Simon Laplace, tied to a deterministic picture of the world • Since the advent of quantum mechanics this view has lost its grip on physics

  12. INTRODUCTION: BAYESIAN THEORY OF PARTIAL BELIEF • A very popular theory of partial belief • Called Bayesian after Thomas Bayes • Adapts the three basic rules for chances as rules for one's degrees of belief based on a given body of evidence • Conditioning: changing one's degrees of belief when that evidence is augmented by the knowledge of a particular proposition

  13. INTRODUCTION: BAYESIAN THEORY OF PARTIAL BELIEF • One's degrees of belief Bel obey the three rules of a chance function • When we learn that a proposition B is true, beliefs are updated by Bayes' rule of conditioning: Bel(A | B) = Bel(A ∩ B) / Bel(B)
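
A short Python sketch of this conditioning step; the frame of outcomes and the prior numbers are invented for illustration:

```python
# Bayes' rule of conditioning on a finite frame: after learning that the
# outcome lies in B, the mass inside B is renormalized and everything
# outside B drops to zero.
P = {"rain": 0.3, "snow": 0.1, "sun": 0.6}   # hypothetical prior over outcomes
B = {"rain", "snow"}                          # proposition learned to be true

pB = sum(p for x, p in P.items() if x in B)
posterior = {x: (p / pB if x in B else 0.0) for x, p in P.items()}
print(posterior)   # {'rain': 0.75, 'snow': 0.25, 'sun': 0.0}
```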

  14. INTRODUCTION: BAYESIAN THEORY OF PARTIAL BELIEF • The Bayesian theory is contained in Shafer's evidence theory as a restrictive special case • Why is the Bayesian theory too restrictive? • The representation of ignorance • Combining vs. conditioning

  15. INTRODUCTION: BAYESIAN THEORY OF PARTIAL BELIEF: The Representation of Ignorance in Evidence Theory • Belief functions can express ignorance directly • Little evidence: both the proposition and its negation have very low degrees of belief • Total ignorance: the vacuous belief function, with Bel(A) = 0 for every A ≠ Θ and Bel(Θ) = 1

  16. INTRODUCTION: BAYESIAN THEORY OF PARTIAL BELIEF: Combination vs. Conditioning • Dempster's rule • A method for changing prior opinion in the light of new evidence • Deals symmetrically with the new and the old evidence • Bayesian theory • Bayes' rule of conditioning • No obvious symmetry • Must assume the exact and full effect of the new evidence is to establish a single proposition with certainty

  17. INTRODUCTION: BAYESIAN THEORY OF PARTIAL BELIEF: The Representation of Ignorance • In the Bayesian theory: • Cannot distinguish between lack of belief and disbelief • Bel(A) cannot be low unless Bel(¬A) is high, since Bel(A) + Bel(¬A) = 1 • Failure to believe A necessitates according belief to ¬A • Ignorance must be represented by spreading belief evenly, e.g. Bel(A) = Bel(¬A) = 1/2 • This was an important factor in the decline of Bayesian ideas in the nineteenth century • In DS theory ignorance is represented by Bel(A) = Bel(¬A) = 0

  18. Belief • The belief in a particular hypothesis is denoted by a number between 0 and 1 • The belief number indicates the degree to which the evidence supports the hypothesis • Evidence against a particular hypothesis is considered to be evidence for its negation (i.e., if Θ = {θ1, θ2, θ3}, evidence against {θ1} is considered to be evidence for {θ2, θ3}, and belief will be allotted accordingly)

  19. Frames of Discernment • Dempster-Shafer theory assumes a fixed, exhaustive set of mutually exclusive events • Θ = {θ1, θ2, ..., θn} • Same assumption as probability theory • Dempster-Shafer theory is concerned with the set of all subsets of Θ, known as the Frame of Discernment • 2^Θ = {∅, {θ1}, …, {θn}, {θ1, θ2}, …, {θ1, θ2, ..., θn}} • A universe of mutually exclusive hypotheses
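
The set 2^Θ can be enumerated directly; a small Python sketch (the three-element Θ is an arbitrary example):

```python
from itertools import chain, combinations

def power_set(theta):
    """All subsets of the frame Θ, i.e. the set 2^Θ, as frozensets."""
    elems = list(theta)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(elems, r) for r in range(len(elems) + 1))]

theta = {"theta1", "theta2", "theta3"}   # hypothetical frame
print(len(power_set(theta)))             # 8 = 2**3 subsets, from the empty set up to Θ
```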

  20. Frames of Discernment • A subset {θ1, θ2, θ3} implicitly represents the proposition that one of θ1, θ2 or θ3 is the case • The complete set Θ represents the proposition that one of the exhaustive set of events is true • So Θ is always true • The empty set ∅ represents the proposition that none of the exhaustive set of events is true • So ∅ is always false

  21. Combining the Evidence • As a theory of evidence, Dempster-Shafer theory has to account for the combination of different sources of evidence • Dempster and Shafer's rule of combination is an essential step in providing such a theory • The rule is best seen as an intuitive heuristic rather than a well-grounded axiom

  22. Advantages of DS theory • The difficult problem of specifying priors can be avoided • In addition to uncertainty, ignorance can also be expressed • It is straightforward to express pieces of evidence with different levels of abstraction • Dempster's combination rule can be used to combine pieces of evidence

  23. Disadvantages • Potential computational complexity problems • It lacks a well-established decision theory, whereas Bayesian decision theory, which maximizes expected utility, is almost universally accepted • Experimental comparisons between DS theory and probability theory are seldom done and rather difficult to do; no clear advantage of DS theory has been shown

  24. Basic Probability Assignment • The basic probability assignment (BPA), represented as m, assigns a belief number in [0,1] to every member of 2^Θ such that the numbers sum to 1 and m(∅) = 0 • m(A) represents the measure of belief that is committed exactly to A (to the subset A itself and to no smaller subset)
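
One convenient (assumed, not prescribed by the theory) way to code a BPA is a dictionary from frozensets to masses; a minimal Python sketch with a validity check:

```python
# A BPA maps subsets of Θ (frozensets) to masses in [0, 1] that sum to 1,
# with no mass on the empty set.
theta = frozenset({"theta1", "theta2", "theta3"})   # hypothetical frame

m = {
    frozenset({"theta1"}): 0.6,   # mass committed exactly to {θ1}
    theta: 0.4,                   # remaining mass left on the whole frame (ignorance)
}

def is_valid_bpa(m, theta):
    ok_range = all(0.0 <= v <= 1.0 for v in m.values())
    ok_sum = abs(sum(m.values()) - 1.0) < 1e-9
    ok_empty = m.get(frozenset(), 0.0) == 0.0
    ok_subsets = all(a <= theta for a in m)          # every key is a subset of Θ
    return ok_range and ok_sum and ok_empty and ok_subsets

print(is_valid_bpa(m, theta))   # True
```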

  25. Basic Probability Assignment: Example • Suppose a diagnostic problem over Θ = {blue, black, yellow} • No information: m(Θ) = 1 • If 60 of 100 are blue: m({blue}) = 0.6, m(Θ) = 0.4 • If 30 of 100 are blue and the rest are black or yellow: m({blue}) = 0.3, m({black, yellow}) = 0.7

  26. Belief Functions • Obtaining the measure of the total belief committed to A: Bel(A) = Σ_{B ⊆ A} m(B) • Belief functions can also be characterized without reference to basic probability assignments, by the conditions given in the introduction: Bel(∅) = 0, Bel(Θ) = 1, and the inclusion-exclusion inequality over finite collections of subsets
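
With the dictionary representation sketched above, Bel(A) is a one-line sum over the focal elements contained in A; the BPA below is hypothetical:

```python
def bel(m, A):
    """Bel(A) = sum of m(B) over all focal elements B that are subsets of A."""
    A = frozenset(A)
    return sum(mass for B, mass in m.items() if B <= A)

# Hypothetical BPA on Θ = {θ1, θ2, θ3}
m = {
    frozenset({"theta1"}): 0.6,
    frozenset({"theta1", "theta2", "theta3"}): 0.4,
}
print(bel(m, {"theta1"}))                         # 0.6
print(bel(m, {"theta1", "theta2"}))               # 0.6 (the mass on Θ is not included)
print(bel(m, {"theta1", "theta2", "theta3"}))     # 1.0
```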

  27. Belief Functions • For Θ = {A, B}: m({A}) = Bel({A}), m({B}) = Bel({B}), m({A, B}) = 1 - Bel({A}) - Bel({B}) • The BPA is unique and can be recovered from the belief function

  28. Belief Functions • Focal element • A subset A ⊆ Θ is a focal element if m(A) > 0 • Core • The union of all the focal elements • Theorem
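
Focal elements and the core fall out of the same dictionary representation; a short sketch with a hypothetical BPA:

```python
def focal_elements(m):
    """Subsets that carry strictly positive mass."""
    return [A for A, mass in m.items() if mass > 0]

def core(m):
    """Union of all the focal elements."""
    out = frozenset()
    for A in focal_elements(m):
        out = out | A
    return out

m = {frozenset({"theta1"}): 0.6,
     frozenset({"theta2", "theta3"}): 0.4}        # hypothetical BPA
print(core(m))   # frozenset({'theta1', 'theta2', 'theta3'})
```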

  29. Belief Functions: Belief Intervals • Ignorance in DS theory is expressed by the belief interval [Bel(A), Pl(A)] • The width of the belief interval, Pl(A) - Bel(A), is the sum of the belief committed to elements that intersect A but are not subsets of A • The width of the interval therefore represents the amount of uncertainty about A, given the evidence

  30. Belief Functions: Degrees of Doubt and Upper Probabilities • One's beliefs about a proposition A are not fully described by one's degree of belief Bel(A) • Bel(A) does not reveal to what extent one doubts A • Degree of doubt: Dou(A) = Bel(¬A) • Upper probability: Pl(A) = 1 - Dou(A) • The total probability mass that can move into A
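
Degree of doubt and upper probability can be computed from the same BPA, and their gap with Bel gives the width of the belief interval [Bel(A), Pl(A)]; a sketch with hypothetical masses:

```python
def bel(m, A):
    A = frozenset(A)
    return sum(mass for B, mass in m.items() if B <= A)

def doubt(m, A, theta):
    """Dou(A) = Bel(¬A)."""
    return bel(m, theta - frozenset(A))

def pl(m, A, theta):
    """Upper probability / plausibility: Pl(A) = 1 - Dou(A)."""
    return 1.0 - doubt(m, A, theta)

theta = frozenset({"theta1", "theta2", "theta3"})     # hypothetical frame
m = {frozenset({"theta1"}): 0.5,
     frozenset({"theta1", "theta2"}): 0.3,
     theta: 0.2}                                      # hypothetical BPA

A = {"theta1"}
print(bel(m, A))                       # 0.5
print(pl(m, A, theta))                 # 1.0  (no focal element is disjoint from {θ1})
print(pl(m, A, theta) - bel(m, A))     # 0.5  -> width of the belief interval [Bel, Pl]
```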

  31. Belief Functions: Degrees of Doubt and Upper Probabilities: Example • m({1, 2}) = -Bel({1}) - Bel({2}) + Bel({1, 2}) = -0.1 - 0.2 + 0.4 = 0.1
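
The example uses the inclusion-exclusion (Möbius) recovery of m from Bel; a sketch that reproduces the slide's numbers (Bel({1}) = 0.1, Bel({2}) = 0.2, Bel({1, 2}) = 0.4):

```python
from itertools import chain, combinations

def subsets(A):
    elems = list(A)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(elems, r) for r in range(len(elems) + 1))]

def m_from_bel(bel_vals, A):
    """Moebius inversion: m(A) = sum over B subset of A of (-1)^|A\\B| * Bel(B)."""
    A = frozenset(A)
    return sum((-1) ** len(A - B) * bel_vals.get(B, 0.0) for B in subsets(A))

# Belief values taken from the slide's example (Bel(∅) = 0 implicitly).
bel_vals = {frozenset({1}): 0.1, frozenset({2}): 0.2, frozenset({1, 2}): 0.4}
print(round(m_from_bel(bel_vals, {1, 2}), 3))   # 0.1 = -0.1 - 0.2 + 0.4
```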

  32. Belief Functions: Bayesian Belief Functions • A belief function Bel is called Bayesian if Bel is a probability function • The following conditions are equivalent: • Bel is Bayesian • All the focal elements of Bel are singletons • For every A ⊆ Θ, Bel(A) + Bel(¬A) = 1 • The inner measure can be characterized by the condition that the focal elements are pairwise disjoint

  33. Belief Functions: Bayesian Belief Functions: Example • Suppose

  34. Dempster's Rule of Combination • Belief functions are well adapted to the representation of evidence because they admit a genuine rule of combination • Several belief functions • Based on distinct bodies of evidence • Computing their "orthogonal sum" using Dempster's rule

  35. Dempster's Rule of Combination: Combining Two Belief Functions • m1: basic probability assignment for Bel1 • A1, A2, …, Ak: Bel1's focal elements • m2: basic probability assignment for Bel2 • B1, B2, …, Bl: Bel2's focal elements

  36. Dempster's Rule of Combination: Combining Two Belief Functions • The probability mass of measure m1(Ai)m2(Bj) is committed to Ai ∩ Bj

  37. Dempster's Rule of Combination: Combining Two Belief Functions • The intersection of the two strips m1(Ai) and m2(Bj) has measure m1(Ai)m2(Bj); since it is committed both to Ai and to Bj, we say that the joint effect of Bel1 and Bel2 is to commit it exactly to Ai ∩ Bj • The total probability mass exactly committed to A: m(A) = Σ_{i,j: Ai ∩ Bj = A} m1(Ai)m2(Bj)

  38. Dempster's Rule of Combination: Combining Two Belief Functions: Example

  39. Dempster's Rule of Combination: Combining Two Belief Functions • The only difficulty: • Some of the rectangles may be committed to the empty set • If Ai and Bj are focal elements of Bel1 and Bel2 and Ai ∩ Bj = ∅, then the mass m1(Ai)m2(Bj) falls on the empty set • The only remedy: • Discard all the rectangles committed to the empty set • Inflate the remaining rectangles by multiplying their measures by (1 - k)^(-1), where k is the total mass committed to the empty set

  40. Dempster's Rule of Combination: The Weight of Conflict • The renormalizing factor measures the extent of conflict between the two belief functions • Every instance in which a rectangle is committed to ∅ corresponds to an instance in which Bel1 and Bel2 commit probability to disjoint subsets Ai and Bj

  41. Dempster's Rule of Combination: The Weight of Conflict (cont.) • Weight of conflict: Con(Bel1, Bel2) = log(1 / (1 - k)) • Bel1, Bel2 do not conflict at all: • k = 0, Con(Bel1, Bel2) = 0 • Bel1, Bel2 flatly contradict each other: • Bel1 ⊕ Bel2 does not exist • k = 1, Con(Bel1, Bel2) = ∞ • In the previous example k = 0.15

  42. Dempster's Rule of Combination • Suppose m1 and m2 are basic probability assignments over Θ. Then m1 ⊕ m2 is given by: (m1 ⊕ m2)(A) = (Σ_{i,j: Ai ∩ Bj = A} m1(Ai)m2(Bj)) / (1 - Σ_{i,j: Ai ∩ Bj = ∅} m1(Ai)m2(Bj)) for all non-empty A ⊆ Θ, and (m1 ⊕ m2)(∅) = 0 • In the previous example
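
A direct Python sketch of the rule; the frame and the two BPAs in the demo are invented for illustration, and the conflict k and the weight of conflict Con = log(1/(1 - k)) defined above are reported alongside the combined BPA:

```python
import math

def combine(m1, m2):
    """Dempster's rule: m1 ⊕ m2. Returns the normalized BPA and the conflict k."""
    raw, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                raw[C] = raw.get(C, 0.0) + a * b
            else:
                conflict += a * b            # mass that would land on the empty set
    if conflict >= 1.0:
        raise ValueError("Bel1 and Bel2 flatly contradict each other; m1 ⊕ m2 does not exist")
    norm = 1.0 - conflict                    # renormalizing factor
    return {A: v / norm for A, v in raw.items()}, conflict

# Hypothetical BPAs on Θ = {a, b, c}
theta = frozenset({"a", "b", "c"})
m1 = {frozenset({"a"}): 0.4, theta: 0.6}
m2 = {frozenset({"b"}): 0.5, theta: 0.5}

m12, k = combine(m1, m2)
print(m12)                                 # {a}: 0.25, {b}: 0.375, Θ: 0.375
print(k, math.log(1.0 / (1.0 - k)))        # k = 0.2, Con(Bel1, Bel2) ≈ 0.223
```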

  43. Dempster's Rule of Combination: An Application of DS Theory • Frame of discernment: a set of mutually exclusive alternatives, here Θ = {SIT, STAND, WALK} • All subsets of the FoD form the set 2^Θ

  44. Dempster's Rule of Combination: An Application of DS Theory • The exercise deploys two “evidences in features”, m1 and m2 • m1 is based on MEAN features from Sensor1 • m1 provides evidence for {SIT} and {¬SIT} ({¬SIT} = {STAND, WALK}) • m2 is based on VARIANCE features from Sensor1 • m2 provides evidence for {WALK} and {¬WALK} ({¬WALK} = {SIT, STAND})

  45. Dempster's Rule of Combination: An Application of DS Theory

  46. Dempster's Rule of Combination: An Application of DS Theory: Calculation of Evidence m1 • Evidence z1 = mean(S1), concrete value z1(t) • Bel(SIT) = 0.2 • Pls(SIT) = 1 - Bel(¬SIT) = 0.5 • m1 for the concrete value: (SIT, ¬SIT, Θ) = (0.2, 0.5, 0.3)

  47. Dempster's Rule of Combination: An Application of DS Theory: Calculation of Evidence m2 • Evidence z2 = variance(S1), concrete value z2(t) • Bel(WALK) = 0.4 • Pls(WALK) = 1 - Bel(¬WALK) = 0.5 • m2 for the concrete value: (WALK, ¬WALK, Θ) = (0.4, 0.5, 0.1)

  48. Dempster's Rule of Combination: An Application of DS Theory: DS Theory Combination • Applying Dempster's combination rule: the conflicting pair {SIT} ∩ {WALK} = ∅ receives mass 0.2 × 0.4 = 0.08 • Due to m(∅) ≠ 0, the remaining masses are normalized with 0.92 (= 1 - 0.08)

  49. Dempster's Rule of Combination: An Application of DS Theory: Normalized Values • Belief(STAND) = 0.25 / 0.92 = 0.272 • Plausibility(STAND) = 1 - (0.108 + 0.022 + 0.217 + 0.13) = 0.523, where the subtracted terms are the normalized masses committed to {SIT} and {WALK}, the subsets that do not intersect {STAND}
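
The whole sensor example can be reproduced with a few lines of Python; the masses are the ones given in the slides, and the sketch recovers the normalization factor 0.92, Belief(STAND) ≈ 0.272 and Plausibility(STAND) ≈ 0.52 (0.523 in the slides, the difference being rounding of intermediate values):

```python
def combine(m1, m2):
    """Dempster's rule of combination; returns the normalized BPA and the conflict k."""
    raw, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                raw[C] = raw.get(C, 0.0) + a * b
            else:
                conflict += a * b
    norm = 1.0 - conflict                       # assumes the sources are not totally conflicting
    return {A: v / norm for A, v in raw.items()}, conflict

def bel(m, A):
    A = frozenset(A)
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A):
    A = frozenset(A)
    return sum(v for B, v in m.items() if B & A)

SIT, STAND, WALK = "SIT", "STAND", "WALK"
theta = frozenset({SIT, STAND, WALK})

# m1 from the MEAN feature: (SIT, ¬SIT, Θ) = (0.2, 0.5, 0.3)
m1 = {frozenset({SIT}): 0.2, frozenset({STAND, WALK}): 0.5, theta: 0.3}
# m2 from the VARIANCE feature: (WALK, ¬WALK, Θ) = (0.4, 0.5, 0.1)
m2 = {frozenset({WALK}): 0.4, frozenset({SIT, STAND}): 0.5, theta: 0.1}

m12, k = combine(m1, m2)
print(round(1 - k, 2))                # 0.92  -> normalization factor
print(round(bel(m12, {STAND}), 3))    # 0.272 -> Belief(STAND)
print(round(pl(m12, {STAND}), 3))     # 0.522 -> Plausibility(STAND)
```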

  50. Dempster's Rule of Combination: An Application of DS Theory • Figure: Belief and Plausibility of (SIT) over time, with ground truth labels 1: Sitting, 2: Standing, 3: Walking
