
Belief Networks




Presentation Transcript


  1. Belief Networks • Done by: • Amna Omar Abdullah • Fatima Mahmood Al-Hajjat • Najwa Ahmad Al-Zubi

  2. Definition of Belief Networks: • A belief network is a directed acyclic graph (DAG) that gives a graphical representation of probabilistic structure. It consists of vertices (or nodes) connected by causal links - represented by arrows - which point from parent nodes (causes) to child nodes (effects).

  3. Each node represents a random variable, and each directed edge represents an immediate dependency or direct influence. • How to design a belief network? • Explore the causal relations

  4. Example • Holmes and Watson have moved to IA. Holmes wakes up to find his lawn wet. He wonders if it has rained or if he left his sprinkler on. He looks at his neighbor Watson’s lawn and sees it is wet too. So, he concludes it must have rained. • Step 1: define the causal components. [Diagram: Rain and Sprinklers as causes; Watson’s lawn wet and Holmes’s lawn wet as effects]
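The causal structure on this slide can be sketched in code. A minimal sketch in Python (not part of the original slides): the network is stored as a dict mapping each node to its list of parents, a representation chosen here purely for illustration.

```python
# Belief-network structure for the lawn example: each node maps to its
# list of parent nodes (causes). The dict-of-parents representation is
# an assumption made for this sketch.
network = {
    "Rain": [],
    "Sprinklers": [],
    "Watson's lawn wet": ["Rain"],
    "Holmes's lawn wet": ["Rain", "Sprinklers"],
}

# Print every causal link as parent -> child.
for child, parents in network.items():
    for parent in parents:
        print(f"{parent} -> {child}")
```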

  5. Cont’d • Holmes and Watson have moved to IA. Holmes wakes up to find his lawn wet. He wonders if it has rained or if he left his sprinkler on. He looks at his neighbor Watson’s lawn and sees it is wet too. So, he concludes it must have rained. • Step 2: define the probabilities. [Diagram: Rain, Sprinklers → Holmes’s lawn wet; Rain → Watson’s lawn wet]

  6. Cont’d • Holmes and Watson have moved to IA. Holmes wakes up to find his lawn wet. He wonders if it has rained or if he left his sprinkler on. He looks at his neighbor Watson’s lawn and sees it is wet too. So, he concludes it must have rained. • [Diagram annotation: given W (Watson’s lawn wet), P(R) goes up and P(S) goes down]

  7. Joint Probability Distribution: • A belief network is related to the joint probability distribution in that it: • 1- represents the information in a more understandable and logical manner, and • 2- makes construction and interpretation much simpler. • The network does not fully specify the joint distribution; what it does is give the conditional dependencies in the probability distribution. • On top of these direct dependence relationships, we need to give some numbers (or functions) which define how the variables relate to one another.

  8. Joint distribution: • P(X1, X2, …, Xn) • Probability assignment to all combinations of values of random variables, providing complete information about the probabilities of its random variables. • A JPD table for n random variables, each ranging over k distinct values, has kⁿ entries!
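The kⁿ growth is easy to check numerically. A quick sketch (the example counts are illustrative, not from the slides):

```python
# Size of a full joint probability table: k**n entries for
# n random variables with k distinct values each.
n, k = 10, 2              # e.g. ten binary variables
print(k ** n)             # 1024 entries
# Thirty binary variables already need over a billion entries:
print(2 ** 30)            # 1073741824
```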

  9. Using the joint distribution • A joint probability distribution P(A,B,C,D,E) over a number of variables can be written in the following form using the chain rule of probability: • P(A,B,C,D,E) = P(A) P(B|A) P(C|B,A) P(D|C,B,A) P(E|D,C,B,A).
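The chain rule above can be checked numerically for the smallest case, two binary variables: P(A,B) = P(A) P(B|A). A sketch with made-up probability values (the numbers are assumptions for illustration only):

```python
# Chain rule for two binary variables: P(A,B) = P(A) * P(B|A).
# The probability values below are assumed for illustration.
p_a = 0.3                  # P(A=true)
p_b_given_a = 0.9          # P(B=true | A=true)
p_ab = p_a * p_b_given_a   # P(A=true, B=true)
print(round(p_ab, 2))      # 0.27
```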

  10. Probability theory • Conditioning: • P(A) = P(A | B) P(B) + P(A | ¬B) P(¬B) • A and B are independent iff • P(A | B) = P(A) • P(B | A) = P(B) • A and B are conditionally independent given C iff • P(A | B, C) = P(A | C) • P(B | A, C) = P(B | C) • Bayes’ Rule: • P(A | B) = P(B | A) P(A) / P(B) • P(A | B, C) = P(B | A, C) P(A | C) / P(B | C)
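Conditioning and Bayes' rule compose naturally: conditioning yields P(B), which then feeds the denominator of Bayes' rule. A numeric sketch with made-up values (none of these numbers come from the slides):

```python
# Conditioning plus Bayes' rule on assumed illustration numbers.
p_a = 0.4               # P(A)
p_b_given_a = 0.7       # P(B | A)
p_b_given_not_a = 0.1   # P(B | not A)

# Conditioning: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b

print(round(p_b, 2))          # 0.34
print(round(p_a_given_b, 3))  # 0.824
```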

  11. Cont’d • Variables: V1, …, Vn • Values: v1, …, vn • P(V1=v1, V2=v2, …, Vn=vn) = Πi P(Vi=vi | parents(Vi)) • P(ABCD) = P(A=true, B=true, C=true, D=true) • P(ABCD) = P(D|ABC) P(ABC) • [Network: A and B (priors P(A), P(B)) are parents of C (CPT P(C|A,B)); C is the parent of D (CPT P(D|C))]

  12. P(ABCD) = • P(D|ABC) P(ABC) = • P(D|C) P(ABC) • A independent from D given C • B independent from D given C • [Network: A, B → C → D, with P(A), P(B), P(C|A,B), P(D|C)]

  13. P(ABCD) = • P(D|ABC) P(ABC) = • P(D|C) P(ABC) = • P(D|C) P(C|AB) P(AB) • A independent from D given C • B independent from D given C • [Network: A, B → C → D, as before]

  14. Cont’d • P(ABCD) = • P(D|ABC) P(ABC) = • P(D|C) P(ABC) = • P(D|C) P(C|AB) P(AB) = • P(D|C) P(C|AB) P(A) P(B) • A independent from B • [Network: A, B → C → D, as before]
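The final factorization P(ABCD) = P(D|C) P(C|A,B) P(A) P(B) can be sketched directly in code. The CPT numbers below are assumptions made for illustration (the slides give only the network structure, not these values); a quick sanity check is that the 16 joint probabilities must sum to 1.

```python
# Factorized joint for the A,B -> C -> D network on the slides.
# All probability values are assumed for illustration.
p_a, p_b = 0.6, 0.3                      # priors P(A), P(B)
p_c_given_ab = {(True, True): 0.9, (True, False): 0.7,
                (False, True): 0.8, (False, False): 0.1}
p_d_given_c = {True: 0.95, False: 0.05}

def joint(a, b, c, d):
    """P(A=a, B=b, C=c, D=d) = P(A) P(B) P(C|A,B) P(D|C)."""
    pa = p_a if a else 1 - p_a
    pb = p_b if b else 1 - p_b
    pc = p_c_given_ab[(a, b)] if c else 1 - p_c_given_ab[(a, b)]
    pd = p_d_given_c[c] if d else 1 - p_d_given_c[c]
    return pa * pb * pc * pd

# Sanity check: all 16 joint probabilities sum to 1.
total = sum(joint(a, b, c, d)
            for a in (True, False) for b in (True, False)
            for c in (True, False) for d in (True, False))
print(round(total, 10))  # 1.0
```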

  15. Probability that Watson Crashes • [Network: Icy, with P(I)=0.7, is a parent of both Watson Crash and Holmes Crash] • P(W) = P(W|I) P(I) + P(W|¬I) P(¬I) = 0.8*0.7 + 0.1*0.3 = 0.56 + 0.03 = 0.59

  16. Cont’d • P(I | W) = P(W | I) P(I) / P(W) • = 0.8*0.7 / 0.59 • ≈ 0.95 • We started with P(I) = 0.7; knowing that Watson crashed raised the probability to 0.95. • [Network: Icy → Watson Crash, Holmes Crash, as before]
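The two icy-road calculations above can be reproduced with the slides' own numbers, P(I)=0.7, P(W|I)=0.8, and P(W|¬I)=0.1:

```python
# The icy-road example: numbers taken from the slides.
p_i = 0.7                 # P(I): prior probability the road is icy
p_w_given_i = 0.8         # P(W | I)
p_w_given_not_i = 0.1     # P(W | not I)

# Conditioning: P(W) = P(W|I) P(I) + P(W|not I) P(not I)
p_w = p_w_given_i * p_i + p_w_given_not_i * (1 - p_i)

# Bayes' rule: P(I|W) = P(W|I) P(I) / P(W)
p_i_given_w = p_w_given_i * p_i / p_w

print(round(p_w, 2))         # 0.59
print(round(p_i_given_w, 2)) # 0.95
```

Observing Watson's crash raises the belief in icy roads from 0.7 to about 0.95, matching the slide.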

  17. Thank you for listening. If you have any questions, don’t hesitate to ask Mr. Ashraf.
