
Lecture 9 – Bayesian Network

Presentation Transcript


  1. Lecture 9 – Bayesian Network Yeni Herdiyeni, Dept. Ilmu Komputer

  2. Probabilities • Probability distribution P(X|x) • X is a random variable • Discrete • Continuous • x is the background state of information

  3. Discrete Random Variables • Finite set of possible outcomes: P(xi) ≥ 0 and P(x1) + P(x2) + … + P(xn) = 1 • X binary: P(x) + P(¬x) = 1

  4. Continuous Random Variable • Probability distribution (density function) p(x) ≥ 0 over continuous values, with total area 1 • The probability of an interval is the area under the density, e.g. P(5 ≤ X ≤ 7) is the area under the curve between 5 and 7 (the values 5 and 7 come from the slide's figure)

  5. More Probabilities • Joint: P(x, y), the probability that both X=x and Y=y • Conditional: P(x | y), the probability that X=x given that we know Y=y

  6. Rules of Probability • Product Rule: P(X, Y) = P(X | Y) P(Y) • Marginalization: P(X) = Σy P(X, Y=y); X binary: P(x) = P(x, y) + P(x, ¬y)
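A minimal Python sketch (added for this transcript, not from the slides; the numbers in the joint table are made up) showing the product rule and marginalization on a small hand-built joint distribution:

    # Joint distribution P(X, Y) for two binary variables; values are illustrative and sum to 1.
    joint = {
        (True, True): 0.30, (True, False): 0.20,
        (False, True): 0.10, (False, False): 0.40,
    }

    # Marginalization: P(X = x) = sum over y of P(X = x, Y = y)
    p_x = {x: sum(p for (xv, _), p in joint.items() if xv == x) for x in (True, False)}

    # Product rule: P(X = x, Y = y) = P(X = x | Y = y) * P(Y = y)
    p_y_true = joint[(True, True)] + joint[(False, True)]
    p_x_given_y = joint[(True, True)] / p_y_true            # P(X = T | Y = T)
    assert abs(p_x_given_y * p_y_true - joint[(True, True)]) < 1e-12

    print(p_x[True])        # prints 0.5
    print(p_x_given_y)      # prints approximately 0.75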

  7. Bayes Rule • P(Y | X) = P(X | Y) P(Y) / P(X)

  8. Conditional Probability • If E and F are independent, then P(E ∩ F) = P(E) P(F) and P(E | F) = P(E) • Law of Total Probability: P(E) = Σi P(E | Fi) P(Fi), where the events Fi partition the sample space

  9. Conditional Probability • Bayes' Theorem: P(F | E) = P(E | F) P(F) / P(E) • Chain Rule: P(X1, X2, …, Xn) = P(X1) P(X2 | X1) P(X3 | X1, X2) … P(Xn | X1, …, Xn−1)

  10. Conditional Independence • Let E, F, and G be events. E and F are conditionally independent given G if P(E ∩ F | G) = P(E | G) P(F | G) • An equivalent definition is: P(E | F, G) = P(E | G)

  11. Naïve Bayes Classifier • In Naïve Bayes, the input variables are assumed to be mutually independent (given the class)!

  12. Naïve Bayes vs Bayesian Network • Naïve Bayes ignores correlations between the input variables: the conditional probabilities of the attributes are assumed to be mutually independent (given the class). • In a Bayesian Network, by contrast, the input variables may depend on one another (they are not assumed to be mutually independent).
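A minimal naïve Bayes sketch in Python (an added illustration with made-up probabilities, not from the slides), showing how the class posterior factorizes because the attributes are assumed conditionally independent given the class:

    # Illustrative (made-up) model: class C in {spam, ham}, two binary attributes A1, A2.
    prior = {"spam": 0.3, "ham": 0.7}
    p_a1 = {"spam": 0.8, "ham": 0.1}   # P(A1 = true | C)
    p_a2 = {"spam": 0.6, "ham": 0.2}   # P(A2 = true | C)

    def posterior(a1, a2):
        """P(C | a1, a2) via the naive Bayes factorization P(C) * P(a1 | C) * P(a2 | C)."""
        score = {}
        for c in prior:
            like1 = p_a1[c] if a1 else 1 - p_a1[c]
            like2 = p_a2[c] if a2 else 1 - p_a2[c]
            score[c] = prior[c] * like1 * like2
        total = sum(score.values())                 # normalize over the classes
        return {c: s / total for c, s in score.items()}

    print(posterior(True, True))   # "spam" comes out as the more probable class here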

  13. Bayesian Networks • A Bayesian network (BN) is a probabilistic graphical model that represents a set of variables together with their dependencies and independencies • Formally, a BN is a directed acyclic graph (DAG) whose nodes represent variables and whose arcs encode direct dependencies between them; the conditional independencies among the variables can be read off the graph structure

  14. Bayesian Network - Example (figure: DAG with arcs family-out (fo) → light-on (lo), family-out (fo) → dog-out (do), bowel-problem (bp) → dog-out (do), and dog-out (do) → hear-bark (hb)) From Charniak

  15. Bayesian Networks • “Over the last few years, a method of reasoning using probabilities, variously called belief networks, Bayesian networks, knowledge maps, probabilistic causal networks, and so on, has become popular within the AI community” - from Charniak • Applications include medical diagnosis, map learning, language understanding, vision, and heuristic search. • In particular, this method is playing an increasingly important role in the design and analysis of machine learning algorithms.

  16. Bayesian Networks Two interpretations • Causal • BNs are used to model situations where causality plays a role, but our understanding is incomplete, so that we must describe things probabilistically • Probabilistic • BNs allow us to calculate the conditional probabilities of the nodes in a network given that some of the values have been observed.

  17. Probabilities in BNs • Specifying the probability distribution for a BN requires: • The prior probabilities of all the root nodes (nodes without parents) • The conditional probabilities of all non-root nodes given all possible combinations of their direct parents • The BN representation can yield significant savings in the number of values needed to specify the probability distribution • If variables are binary, then 2^n − 1 values are required for the complete joint distribution, where n is the number of variables
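As a concrete count for the dog-out example below: the full joint over its five binary variables would need 2^5 − 1 = 31 numbers, while the network needs only 1 (fo) + 1 (bp) + 2 (lo | fo) + 4 (do | fo, bp) + 2 (hb | do) = 10 conditional probability values.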

  18. Probabilities in BNs - Example (from Charniak) • P(fo) = .15, P(bp) = .01 • P(lo | fo) = .6, P(lo | ¬fo) = .05 • P(do | fo, bp) = .99, P(do | fo, ¬bp) = .90, P(do | ¬fo, bp) = .97, P(do | ¬fo, ¬bp) = .3 • P(hb | do) = .7, P(hb | ¬do) = .01
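To make the worked examples on the following slides easy to check, here is a small Python sketch (added to this transcript, not part of the original slides) that encodes these CPTs and the joint distribution they define:

    from itertools import product

    # Priors and CPTs from the slide (True means the event occurs).
    P_fo, P_bp = 0.15, 0.01
    P_lo = {True: 0.6, False: 0.05}                      # P(lo | fo)
    P_do = {(True, True): 0.99, (True, False): 0.90,     # P(do | fo, bp)
            (False, True): 0.97, (False, False): 0.3}
    P_hb = {True: 0.7, False: 0.01}                      # P(hb | do)

    def bern(p_true, value):
        """Probability that a binary variable takes `value`, given P(True) = p_true."""
        return p_true if value else 1 - p_true

    def joint(fo, bp, lo, do, hb):
        """P(fo, bp, lo, do, hb), factorized along the arcs of the DAG."""
        return (bern(P_fo, fo) * bern(P_bp, bp) *
                bern(P_lo[fo], lo) * bern(P_do[(fo, bp)], do) *
                bern(P_hb[do], hb))

    # Sanity check: the joint sums to 1 over all 2**5 assignments.
    assert abs(sum(joint(*a) for a in product([True, False], repeat=5)) - 1) < 1e-9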

  19. Calculating Probabilities - Example What is the probability that the lights are out? P(lo) = P(lo | fo) P(fo) + P(lo | ¬fo) P(¬fo) = .6 (.15) + .05 (.85) = 0.1325
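Using the joint() helper from the sketch above, the same number comes out by summing over all other variables:

    p_lo = sum(joint(fo, bp, True, do, hb)
               for fo in (True, False) for bp in (True, False)
               for do in (True, False) for hb in (True, False))
    print(round(p_lo, 4))   # 0.1325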

  20. Calculating Probabilities - Example What is the probability that the dog is out? P(do) = P(do | fo, bp) P(fo, bp) + P(do | fo, ¬bp) P(fo, ¬bp) + P(do | ¬fo, bp) P(¬fo, bp) + P(do | ¬fo, ¬bp) P(¬fo, ¬bp) = P(do | fo, bp) P(fo) P(bp) + P(do | fo, ¬bp) P(fo) P(¬bp) + P(do | ¬fo, bp) P(¬fo) P(bp) + P(do | ¬fo, ¬bp) P(¬fo) P(¬bp) = .99(.15)(.01) + .90(.15)(.99) + .97(.85)(.01) + .3(.85)(.99) ≈ 0.4 (fo and bp are independent, so each joint term factors into the product of the priors)
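The same marginal by brute-force enumeration with the joint() helper:

    p_do = sum(joint(fo, bp, lo, True, hb)
               for fo in (True, False) for bp in (True, False)
               for lo in (True, False) for hb in (True, False))
    print(round(p_do, 4))   # 0.3958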

  21. Types of Connections in BNs (figure: three three-node patterns) • Linear: a → b → c • Converging: a → b ← c • Diverging: a ← b → c

  22. Independence Assumptions • Linear connection: The two end variables are usually dependent on each other. Observing the middle variable renders them independent. • Converging connection: The two end variables are usually independent of each other. Observing the middle variable (or one of its descendants) renders them dependent. • Diverging connection: The two end variables are usually dependent on each other. Observing the middle variable renders them independent.
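In the dog-out network, for instance, fo → do → hb is linear (hearing barking tells us nothing more about the family being out once we know whether the dog is out), fo → do ← bp is converging (family-out and bowel-problem are independent a priori, but become dependent once dog-out is observed), and lo ← fo → do is diverging (light-on and dog-out are dependent, but independent given family-out).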

  23. Inference in Bayesian Networks • A basic task for BNs is to compute the posterior probability distribution for a set of query variables, given values for some evidence variables. This is called inference or belief updating. • The input to a BN inference evaluation is a set of evidence, e.g. E = { hear-bark = true, light-on = true } • The outputs of the BN inference evaluation are conditional probabilities P(Xi = v | E), where Xi is a variable in the network.
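A small enumeration-based query helper (building on the joint() sketch after slide 18; exact inference by enumerating the full joint is only practical for tiny networks like this one):

    def query(var, evidence):
        """P(var = True | evidence), by summing the joint over all consistent assignments."""
        names = ["fo", "bp", "lo", "do", "hb"]
        numer = denom = 0.0
        for values in product([True, False], repeat=5):
            a = dict(zip(names, values))
            if any(a[k] != v for k, v in evidence.items()):
                continue            # assignment inconsistent with the evidence
            p = joint(**a)
            denom += p
            if a[var]:
                numer += p
        return numer / denom

    # Posterior of each possible cause given E = { hear-bark = true, light-on = true }:
    print(query("fo", {"hb": True, "lo": True}))
    print(query("bp", {"hb": True, "lo": True}))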

  24. Inference in Bayesian Networks • Types of inference: • Diagnostic • Causal • Intercausal (Explaining Away) • Mixed

  25. A Bayesian Network (figure: a seven-node BN over Age, Gender, Exposure to Toxics, Smoking, Cancer, Serum Calcium, and Lung Tumor; Exposure to Toxics and Smoking are parents of Cancer, which in turn is the parent of Serum Calcium and Lung Tumor)

  26. Independence (figure: root nodes Age and Gender) Age and Gender are independent: A ⊥ G • P(A | G) = P(A) and P(G | A) = P(G) • P(A, G) = P(A | G) P(G) = P(G | A) P(A) = P(A) P(G)

  27. Conditional Independence (figure: Age, Gender, Smoking, Cancer) Cancer is independent of Age and Gender given Smoking: C ⊥ A, G | S, i.e. P(C | A, G, S) = P(C | S)

  28. More Conditional Independence: Naïve Bayes (figure: Cancer → Serum Calcium, Cancer → Lung Tumor) • Serum Calcium and Lung Tumor are dependent • Serum Calcium is independent of Lung Tumor given Cancer: P(L | SC, C) = P(L | C)

  29. More Conditional Independence: Explaining Away (figure: Exposure to Toxics → Cancer ← Smoking) • Exposure to Toxics and Smoking are (marginally) independent: E ⊥ S • Exposure to Toxics is dependent on Smoking given Cancer: P(E = heavy | C = malignant) > P(E = heavy | C = malignant, S = heavy)

  30. Put it all together (figure: the full BN over Age, Gender, Exposure to Toxics, Smoking, Cancer, Serum Calcium, and Lung Tumor)

  31. Diagnostic Inference (figure: evidence E at an effect node, query Q at a cause node) • Inferring the probability of a cause based on evidence of an effect • Also known as “bottom up” reasoning

  32. Probabilities in BNs - Example (CPTs repeated from slide 18; from Charniak) • P(fo) = .15, P(bp) = .01 • P(lo | fo) = .6, P(lo | ¬fo) = .05 • P(do | fo, bp) = .99, P(do | fo, ¬bp) = .90, P(do | ¬fo, bp) = .97, P(do | ¬fo, ¬bp) = .3 • P(hb | do) = .7, P(hb | ¬do) = .01

  33. Example: Diagnostic Inference (figure: family-out and bowel-problem → dog-out) Given that the dog is out, what is the probability that the family is out? That the dog has a bowel problem? What is the most probable cause of the dog being out?
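With the query() helper above, these diagnostic posteriors fall out directly; family-out is the more probable explanation:

    print(round(query("fo", {"do": True}), 3))   # P(fo | do) ≈ 0.341
    print(round(query("bp", {"do": True}), 3))   # P(bp | do) ≈ 0.025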

  34. Causal Inference (figure: evidence E at a cause node, query Q at an effect node) • Inferring the probability of an effect based on evidence of a cause • Also known as “top down” reasoning

  35. Example: Causal Inference (figure: family-out and bowel-problem → dog-out) What is the probability that the dog is out given that the family is out? P(do | fo) = P(do | fo, bp) P(bp) + P(do | fo, ¬bp) P(¬bp) = .99 (.01) + .90 (.99) ≈ 0.90 What is the probability that the dog is out given that he has a bowel problem? P(do | bp) = P(do | bp, fo) P(fo) + P(do | bp, ¬fo) P(¬fo) = .99 (.15) + .97 (.85) = 0.973
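The same two causal queries, checked with the query() helper:

    print(round(query("do", {"fo": True}), 3))   # ≈ 0.901
    print(round(query("do", {"bp": True}), 3))   # 0.973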

  36. Intercausal Inference (Explaining Away) (figure: two cause nodes converging on a common effect; the query Q is one cause, the evidence includes the effect E and the other cause F) • Involves two causes that "compete" to "explain" an effect • The causes become conditionally dependent given that their common effect is observed, even though they are marginally independent.

  37. Explaining Away - Example (figure: family-out and bowel-problem → dog-out) What is the probability that the family is out given that the dog is out and has a bowel problem? Evidence of the bowel problem “explains away” the fact that the dog is out, so P(fo | do, bp) < P(fo | do).
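Checking the explaining-away effect numerically with the query() helper; the posterior on family-out drops once the bowel problem is also observed:

    print(round(query("fo", {"do": True}), 3))              # ≈ 0.341
    print(round(query("fo", {"do": True, "bp": True}), 3))  # ≈ 0.153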

  38. Mixed Inference (figure: evidence both upstream and downstream of the query node Q) • Combines two or more diagnostic, causal, or intercausal inferences

  39. Predictive Inference (figure: the cancer BN) How likely are elderly males to get malignant cancer? P(C = malignant | Age > 60, Gender = male)

  40. Combined (figure: the cancer BN) How likely is an elderly male patient with high Serum Calcium to have malignant cancer? P(C = malignant | Age > 60, Gender = male, Serum Calcium = high)

  41. Explaining away (figure: the cancer BN) • If we see a lung tumor, the probability of heavy smoking and of exposure to toxics both go up. • If we then observe heavy smoking, the probability of exposure to toxics goes back down.

  42. Bayesian Network - Exercises Yeni Herdiyeni

  43. Bayesian Network • From the figure (a rain/sprinkler/wet-grass network whose image is not in this transcript; R = rain, S = sprinkler, W = wet grass), the joint probability P(R, W) can be obtained. Given P(R) = 0.4, we have P(~R) = 0.6, and P(~W | ~R) = 0.8 is also given. • Bayes' rule can then be used to make a diagnosis.

  44. Bayesian Network For example, if the grass is known to be wet, the probability that it rained can be computed as follows:
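The slide's formula image is not in the transcript; in symbols, the computation it refers to is Bayes' rule with the wet-grass evidence: P(R | W) = P(W | R) P(R) / P(W) = P(W | R) P(R) / [P(W | R) P(R) + P(W | ~R) P(~R)], where P(~R) = 0.6 and P(W | ~R) = 1 − P(~W | ~R) = 0.2 from the numbers above, and P(W | R) is read from the CPT in the original figure.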

  45. Bayesian Network • What is the probability that the grass is wet given that the sprinkler is on (with no information about whether it rained), P(W | S)?
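Assuming Sprinkler and Rain are root nodes of the exercise network at this point (Cloudy, their possible common parent, is only introduced on slide 48), this is a marginalization over Rain: P(W | S) = P(W | S, R) P(R) + P(W | S, ~R) P(~R), with the two conditionals taken from the wet-grass CPT in the figure.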

  46. Bayesian Network • What is the probability that the sprinkler is on given that the grass is wet, P(S | W)?
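By Bayes' rule, P(S | W) = P(W | S) P(S) / P(W), with P(W | S) computed as on the previous slide and P(W) obtained by marginalizing over both Sprinkler and Rain.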

  47. Bayesian Network • If it is known to be raining, what is the probability that the sprinkler is on?

  48. Bayesian Network • What if we add the assumption: if the weather is cloudy, then the sprinkler is most likely not turned on?

  49. Bayesian Network • What is the probability that the grass is wet given that it is cloudy?
