
Tues. March 9, 1999


Presentation Transcript


  1. Tues. March 9, 1999 • The Logic of Probability Theory • Foundations, Notation and Definitions • Axioms and Theorems • Conditional Probability, Independence • Chain Rule • Probability Trees • Bayes’ Theorem • Thursday: Discrete Random Events

  2. Differing Views of Probabilities • Frequentist View: P = W/N is the probability of an event, where over repeated experiments or trials: • W = number of trials in which the event occurred • N = total number of trials • Example: roll of a die; toss of a thumbtack
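
The frequentist view lends itself to a direct simulation. Below is a minimal Python sketch, not from the original slides (the names relative_frequency, trial, and event are my own): it estimates P = W/N by counting how often an event occurs across N trials. A die has a symmetry argument giving 1/6; a thumbtack does not, which is exactly why its landing probability must be estimated from trials like these.

```python
import random

def relative_frequency(event, trial, n=100_000):
    """Frequentist estimate P = W/N: the fraction of n trials in which the event occurs."""
    w = sum(event(trial()) for _ in range(n))
    return w / n

# A fair die has a symmetry argument: Pr(roll == 6) = 1/6, and the estimate
# converges to it. A thumbtack has no such symmetry, so its probability of
# landing point-up can only come from repeated trials.
die = lambda: random.randint(1, 6)
print(relative_frequency(lambda roll: roll == 6, die))  # ~0.1667
```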

  3. Subjective or Bayesian Definition • A probability is a measure assigned to an individual's subjective assessment of the likelihood of an event based on his state of information. The probability measure • (1) depends on the state of information, • (2) may change with new information, • (3) may vary among individuals, and • (4) corresponds to the areas on a Venn diagram.

  4. Conditional Probability Notation • I will use the following notation to represent probabilities: • Pr(A|B,S) denotes the probability of event A given that B has occurred, for a given state of information S.

  5. Axioms of Probability Theory (1) For any event A, the probability measure is a real non-negative number such that Pr(A|S) ≥ 0. (2) Pr(I|S) = 1, where I = the universe of all events. (3) If AB = ø (i.e., if A and B are mutually exclusive), then Pr(A+B|S) = Pr(A|S) + Pr(B|S).
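
As an illustration only (not part of the slides), the three axioms can be checked mechanically for a small discrete measure. A sketch assuming a fair die, using exact fractions to avoid rounding:

```python
from fractions import Fraction

# A discrete probability measure for a fair die: atomic outcome -> mass.
p = {x: Fraction(1, 6) for x in range(1, 7)}

def pr(event):
    # Pr(A|S): total mass of the outcomes in A.
    return sum(p[x] for x in event)

assert all(m >= 0 for m in p.values())     # axiom (1): non-negativity
assert pr(p.keys()) == 1                   # axiom (2): Pr(I|S) = 1
A, B = {1, 2}, {5, 6}                      # AB = ø: mutually exclusive
assert pr(A | B) == pr(A) + pr(B)          # axiom (3): additivity
```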

  6. Axioms in Terms of Venn Diagrams • (1) Pr(A|S), Pr(B|S) ≥ 0 restricts the area in any region to be non-negative. • (2) Pr(I|S) = 1 requires that the area of the entire diagram, corresponding to the universe of all events "I", be normalized to one. • (3) Pr(A+B|S) = Pr(A|S) + Pr(B|S) for mutually exclusive A and B implies that the area contained in two non-overlapping regions is the sum of their individual areas.

  7. Fundamental Theorems • (1) Pr(A'|S) = 1 - Pr(A|S) • (2) Pr(ø|S) = 0 • (3) Pr(A + B|S) = Pr(A|S) + Pr(B|S) - Pr(AB|S)
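
A quick sanity check of the three theorems on the same fair-die measure (my own illustration, not from the slides):

```python
from fractions import Fraction

p = {x: Fraction(1, 6) for x in range(1, 7)}       # fair die, as before
pr = lambda event: sum(p[x] for x in event)
I = set(p)                                         # the universe of events

A, B = {1, 2, 3}, {3, 4}
assert pr(I - A) == 1 - pr(A)                      # (1) Pr(A'|S) = 1 - Pr(A|S)
assert pr(set()) == 0                              # (2) Pr(ø|S) = 0
assert pr(A | B) == pr(A) + pr(B) - pr(A & B)      # (3) inclusion-exclusion
```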

  8. Conditional Probability • Pr(B|A,S) = Pr(AB|S) / Pr(A|S) • Pr(A|B,S) = Pr(AB|S) / Pr(B|S)
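
A worked example of the definition, again on a fair die (illustration only): conditioning on "the roll is even" renormalizes the measure to that event.

```python
from fractions import Fraction

p = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die
pr = lambda event: sum(p[x] for x in event)

A = {2, 4, 6}                  # "the roll is even"
B = {4, 5, 6}                  # "the roll is at least 4"
# Pr(B|A,S) = Pr(AB|S) / Pr(A|S): renormalize within A.
print(pr(A & B) / pr(A))       # 2/3
```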

  9. Venn Diagrams and Conditional Probabilities • The conditional probability can be shown in the Venn diagram as the ratio of the area shared by both events (the intersection) to the area of the conditioning event. • We can look at conditional probability as a means of "renormalizing" the probability measure when new information shows that an event is known with certainty (i.e., with a probability equal to one).

  10. Independence • Two events are independent if the probability of their joint occurrence (the product AB) equals the product of their individual probabilities. • Pr(AB|S) = Pr(A|S) Pr(B|S) implies independence • In other words, A is probabilistically independent of B if having knowledge about B gives you no new information about A, and vice versa.

  11. Independence and Conditional Probability If A and B are independent (assuming nonzero probabilities): Pr(B|A,S) = Pr(AB|S) / Pr(A|S) = Pr(A|S) Pr(B|S) / Pr(A|S) = Pr(B|S) Pr(A|B,S) = Pr(AB|S) / Pr(B|S) = Pr(A|S) Pr(B|S) / Pr(B|S) = Pr(A|S)
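
A small check of this derivation (my own sketch, not from the slides): for two fair coin flips the joint probability factors, so the conditional probability equals the marginal.

```python
from fractions import Fraction
from itertools import product

# Two fair coin flips; every atomic outcome has mass 1/4.
p = {flips: Fraction(1, 4) for flips in product("HT", repeat=2)}
pr = lambda event: sum(p[x] for x in event)

A = {x for x in p if x[0] == "H"}    # first flip is heads
B = {x for x in p if x[1] == "H"}    # second flip is heads
assert pr(A & B) == pr(A) * pr(B)    # Pr(AB|S) = Pr(A|S) Pr(B|S): independent
assert pr(A & B) / pr(A) == pr(B)    # hence Pr(B|A,S) = Pr(B|S)
```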

  12. Influence Diagrams and Independence • Conditional probabilistic influence and independence can be explicitly represented as expansions of joint probability distributions in influence diagrams. • The strongest piece of information is the absence of an influence, i.e., the identification of (conditional) independence. • We must assume that influences may be present whenever no information indicating independence is provided.

  13. Definition: Independence • Consider two (uncertain) state variables x and y. y does not influence x if x is independent of y given S, i.e., if Pr(x|y,S) = Pr(x|S).

  14. Chain Rule • The concept of conditional probability can be expanded to accommodate any number of events. For example, with three events A, B, and C: • Pr(ABC|S) = Pr(AB|C,S)Pr(C|S) = Pr(A|BC,S)Pr(B|C,S)Pr(C|S) [1] • Or: Pr(ABC|S) = Pr(BC|A,S)Pr(A|S) = Pr(B|CA,S)Pr(C|A,S)Pr(A|S) [2]
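
The chain rule can be verified numerically. The sketch below uses a made-up joint distribution over three binary variables (the masses 1/36 through 8/36 are arbitrary, chosen only to sum to one) and confirms expansion [1]:

```python
from fractions import Fraction
from itertools import product

# A made-up joint distribution Pr(ABC|S) over three binary variables;
# the eight masses 1/36 ... 8/36 sum to one.
outcomes = list(product((0, 1), repeat=3))
joint = {x: Fraction(w, 36) for w, x in enumerate(outcomes, start=1)}
pr = lambda cond: sum(m for x, m in joint.items() if cond(x))

a, b, c = 1, 0, 1
pr_C          = pr(lambda x: x[2] == c)                    # Pr(C|S)
pr_B_and_C    = pr(lambda x: x[1] == b and x[2] == c)      # Pr(BC|S)
pr_B_given_C  = pr_B_and_C / pr_C                          # Pr(B|C,S)
pr_A_given_BC = joint[(a, b, c)] / pr_B_and_C              # Pr(A|BC,S)
# Chain rule [1]: Pr(ABC|S) = Pr(A|BC,S) Pr(B|C,S) Pr(C|S)
assert joint[(a, b, c)] == pr_A_given_BC * pr_B_given_C * pr_C
```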

  15. Three Influence Diagram Expansions for Pr(ABC|S) • There are six different expansion orderings possible if conditional independence is not considered. • If conditional independence exists, any of the three arcs may be deleted. For example, where B is independent of C, the joint expansion of Pr(ABC|S) becomes equation [3]: • Pr(ABC|S) = Pr(AB|C,S)Pr(C|S) = Pr(A|BC,S)Pr(B|S)Pr(C|S) [3]

  16. Probability Trees • A probability tree is a succession of circular nodes (uncertain state variables) with branches. • The branches emanating from each node represent the different possible values of the uncertain variables associated with the node.
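
A probability tree is straightforward to represent in code. The sketch below is my own construction (the branch probabilities are hypothetical): it enumerates every root-to-leaf path, multiplying branch probabilities along the way, which is exactly the chain-rule product shown on the next slide.

```python
# A probability tree as nested tuples: (variable name, {branch value:
# (branch probability, subtree)}), with None at the leaves.
# The numbers are hypothetical, for illustration only.
tree = ("C", {0: (0.3, ("B", {0: (0.5, None), 1: (0.5, None)})),
              1: (0.7, ("B", {0: (0.2, None), 1: (0.8, None)}))})

def paths(node, prob=1.0, assignment=()):
    """Yield every root-to-leaf path with its probability: the product of
    the branch probabilities along the path (the chain rule)."""
    if node is None:
        yield assignment, prob
        return
    name, branches = node
    for value, (branch_prob, subtree) in branches.items():
        yield from paths(subtree, prob * branch_prob, assignment + ((name, value),))

for assignment, prob in paths(tree):
    print(assignment, prob)       # the four path probabilities sum to 1
```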

  17. Probability Tree Pr(ABC|S) = Pr(A|BC,S)Pr(B|C,S)Pr(C|S)

  18. Bayes’ Theorem • Bayes' theorem is the single most important relation in probabilistic expert systems. • It forms the basis for probabilistic inference. • Consider N mutually exclusive and collectively exhaustive events A1, A2, ..., AN: AiAj = ø for i ≠ j, i, j ∈ [1, N] (mutually exclusive) A1 + A2 + ... + AN = I (collectively exhaustive)

  19. Bayes’ Question • Consider another event B that may overlap several of the events Ai, i = 1, 2, ..., N. • If we know all the probabilities Pr(Ai|S) and Pr(B|Ai,S) (i.e., we know all of the corresponding area relationships in the Venn diagram), then we know Pr(AiB|S) = Pr(Ai|S)Pr(B|Ai,S).

  20. Bayes’ Question • We know Pr(Ai|S) and Pr(B|Ai,S), and thus Pr(AiB|S) = Pr(Ai|S)Pr(B|Ai,S). • What is Pr(Ai|B,S)?
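
The answer is the standard Bayes’ theorem: Pr(Ai|B,S) = Pr(Ai|S)Pr(B|Ai,S) / Pr(B|S), where Pr(B|S) = the sum over j of Pr(Aj|S)Pr(B|Aj,S), because the Aj are mutually exclusive and collectively exhaustive. A minimal sketch (the numbers below are hypothetical, for illustration only):

```python
def bayes(prior, likelihood):
    """Answer Bayes' question: Pr(Ai|B,S) = Pr(Ai|S) Pr(B|Ai,S) / Pr(B|S),
    where Pr(B|S) = sum over j of Pr(Aj|S) Pr(B|Aj,S), since the Aj are
    mutually exclusive and collectively exhaustive."""
    joint = {i: prior[i] * likelihood[i] for i in prior}   # Pr(AiB|S)
    pr_B = sum(joint.values())                             # Pr(B|S)
    return {i: mass / pr_B for i, mass in joint.items()}   # Pr(Ai|B,S)

# Hypothetical numbers for three exclusive, exhaustive events.
prior      = {"A1": 0.5, "A2": 0.3, "A3": 0.2}   # Pr(Ai|S)
likelihood = {"A1": 0.1, "A2": 0.6, "A3": 0.3}   # Pr(B|Ai,S)
print(bayes(prior, likelihood))   # posteriors sum to 1
```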
