Belief Networks • Done by: • Amna Omar Abdullah • Fatima Mahmood Al-Hajjat • Najwa Ahmad Al-Zubi
Definition of Belief Networks: • A belief network is a directed acyclic graph (DAG) that gives a graphical representation of probabilistic structure. It consists of vertices (or nodes) connected by causal links, represented by arrows, which point from parent nodes (causes) to child nodes (effects).
Each node represents a random variable, and each directed edge represents an immediate dependence or direct influence. • How to design a Belief Network? • Explore the causal relations between the variables
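One way to make the DAG structure concrete is a small sketch in code. The dictionary encoding and the names below are illustrative choices, not part of the slides; the structure is the wet-lawn example used later in the deck.

```python
# A minimal sketch of a belief network's DAG: each node maps to the
# list of its parents (causes). Names and encoding are hypothetical.
parents = {
    "rain": [],
    "sprinkler": [],
    "watson_lawn_wet": ["rain"],
    "holmes_lawn_wet": ["rain", "sprinkler"],
}

# Child (effect) links can be recovered by inverting the parent links.
children = {node: [] for node in parents}
for node, node_parents in parents.items():
    for parent in node_parents:
        children[parent].append(node)

print(children["rain"])  # ['watson_lawn_wet', 'holmes_lawn_wet']
```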
Example • Holmes and Watson have moved to IA. Holmes wakes up to find his lawn wet. He wonders if it has rained or if he left his sprinkler on. He looks at his neighbor Watson's lawn and sees it is wet too. So, he concludes it must have rained. • Defining the causal components. (Diagram: Rain → Watson's lawn wet; Rain → Holmes's lawn wet; Sprinkler → Holmes's lawn wet.)
Cont'd • Same scenario as above. • Define the probabilities. (Diagram: the same network, now with a probability table attached to each node.)
Cont'd • Same scenario as above. • Given w (Watson's lawn is wet), P(R) goes up and P(S) goes down.
Joint Probability Distribution: • A belief network is related to the joint probability distribution in that: • 1- It represents the information in a more understandable and logical manner. • 2- It makes construction and interpretation much simpler. • However, the network does not fully specify the joint distribution. What it does is give the conditional dependencies in the probability distribution. • On top of these direct dependence relationships, we need to give some numbers (or functions) which define how the variables relate to one another.
Joint distribution: • P(X1, X2, …, Xn) • A probability assignment to all combinations of values of the random variables; it provides complete information about the probabilities of those variables. • A JPD table for n random variables, each ranging over k distinct values, has kⁿ entries!
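The kⁿ blow-up is easy to check numerically; a quick sketch:

```python
# A JPD table for n random variables, each over k distinct values,
# has k**n entries.
n, k = 5, 2        # e.g. 5 binary variables
print(k ** n)      # 32 entries

# The table grows exponentially: 20 binary variables already need
# more than a million entries.
print(2 ** 20)     # 1048576 entries
```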
Using the joint distribution • A joint probability distribution P(A,B,C,D,E) over a number of variables can be written in the following form using the chain rule of probability: • P(A,B,C,D,E) = P(A) P(B|A) P(C|B,A) P(D|C,B,A) P(E|D,C,B,A).
Probability theory • Conditioning • P(A) = P(A | B) P(B) + P(A | ¬B) P(¬B) • A and B are independent iff • P(A | B) = P(A) • P(B | A) = P(B) • A and B are conditionally independent given C iff • P(A | B, C) = P(A | C) • P(B | A, C) = P(B | C) • Bayes’ Rule • P(A | B) = P(B | A) P(A) / P(B) • P(A | B, C) = P(B | A, C) P(A | C) / P(B | C)
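These identities can be verified numerically. A sketch using a made-up joint distribution over two binary variables A and B (the numbers are illustrative, not from the slides):

```python
# Made-up joint distribution P(A, B), keyed by (a, b) truth values;
# the four entries sum to 1.
joint = {(True, True): 0.2, (True, False): 0.2,
         (False, True): 0.1, (False, False): 0.5}

p_a = joint[(True, True)] + joint[(True, False)]    # marginal P(A)
p_b = joint[(True, True)] + joint[(False, True)]    # marginal P(B)
p_a_given_b = joint[(True, True)] / p_b             # P(A | B)
p_b_given_a = joint[(True, True)] / p_a             # P(B | A)
p_a_given_not_b = joint[(True, False)] / (1 - p_b)  # P(A | not B)

# Conditioning: P(A) = P(A|B) P(B) + P(A|not B) P(not B)
assert abs(p_a - (p_a_given_b * p_b + p_a_given_not_b * (1 - p_b))) < 1e-12

# Bayes' Rule: P(A|B) = P(B|A) P(A) / P(B)
assert abs(p_a_given_b - p_b_given_a * p_a / p_b) < 1e-12
```

With these numbers A and B are dependent (P(A | B) ≈ 0.67 while P(A) = 0.4), so both identities are checked in the non-trivial case.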
Cont’d • Variables: V1, …, Vn • Values: v1, …, vn • P(V1=v1, V2=v2, …, Vn=vn) = Πi P(Vi=vi | parents(Vi)) • Example (network: A and B are parents of C, and C is the parent of D, with tables P(A), P(B), P(C|A,B), P(D|C)): • P(ABCD) = P(A=true, B=true, C=true, D=true) • P(ABCD) = P(D|ABC) P(ABC)
P(ABCD) = • P(D|ABC) P(ABC) = • P(D|C) P(ABC) • A is independent from D given C • B is independent from D given C
P(ABCD) = • P(D|ABC) P(ABC) = • P(D|C) P(ABC) = • P(D|C) P(C|AB) P(AB) • A is independent from D given C • B is independent from D given C
Cont’d • P(ABCD) = • P(D|ABC) P(ABC) = • P(D|C) P(ABC) = • P(D|C) P(C|AB) P(AB) = • P(D|C) P(C|AB) P(A) P(B) • A is independent from B
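Once numbers are attached to each node, the factorization P(ABCD) = P(D|C) P(C|AB) P(A) P(B) can be evaluated directly. A sketch with hypothetical conditional probability values (the slides do not give any):

```python
from itertools import product

# Hypothetical probability tables (not from the slides) for the
# network where A and B are parents of C, and C is the parent of D.
p_a = 0.6                                          # P(A=true)
p_b = 0.3                                          # P(B=true)
p_c = {(True, True): 0.9, (True, False): 0.5,
       (False, True): 0.4, (False, False): 0.1}    # P(C=true | A, B)
p_d = {True: 0.7, False: 0.2}                      # P(D=true | C)

# P(A=true, B=true, C=true, D=true) via the factorization
p_abcd = p_a * p_b * p_c[(True, True)] * p_d[True]
print(round(p_abcd, 4))  # 0.1134

# Sanity check: the factored joint sums to 1 over all 2**4 assignments.
total = sum(
    (p_a if a else 1 - p_a)
    * (p_b if b else 1 - p_b)
    * (p_c[(a, b)] if c else 1 - p_c[(a, b)])
    * (p_d[c] if d else 1 - p_d[c])
    for a, b, c, d in product([True, False], repeat=4)
)
assert abs(total - 1.0) < 1e-9
```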
Probability that Watson Crashes • (Network: Icy is a parent of both Watson Crash and Holmes Crash; P(I) = 0.7.) • P(W) = P(W | I) P(I) + P(W | ¬I) P(¬I) = 0.8*0.7 + 0.1*0.3 = 0.56 + 0.03 = 0.59
Cont’d • P(I | W) = P(W | I) P(I) / P(W) • = 0.8*0.7 / 0.59 • ≈ 0.95 • We started with P(I) = 0.7; knowing that Watson crashed raises the probability of icy roads to about 0.95.
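The two computations above, conditioning to get P(W) and then Bayes' Rule to get P(I | W), can be reproduced directly from the slide's numbers:

```python
# Numbers from the slides: P(I)=0.7, P(W|I)=0.8, P(W|not I)=0.1.
p_i = 0.7
p_w_given_i = 0.8
p_w_given_not_i = 0.1

# Conditioning: P(W) = P(W|I) P(I) + P(W|not I) P(not I)
p_w = p_w_given_i * p_i + p_w_given_not_i * (1 - p_i)
print(round(p_w, 2))           # 0.59

# Bayes' Rule: P(I|W) = P(W|I) P(I) / P(W)
p_i_given_w = p_w_given_i * p_i / p_w
print(round(p_i_given_w, 2))   # 0.95
```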
Thank you for listening. If you have any questions, don’t hesitate to ask Mr. Ashraf.