
Probability Theory


Presentation Transcript


  1. Probability Theory
  Joint probability: p(X,Y) = p(X | Y) p(Y) = p(Y | X) p(X)
  Bayes' Theorem: p(X | Y) = p(X,Y) / p(Y) = p(Y | X) p(X) / p(Y)
  Independence: p(X,Y) = p(X) p(Y), i.e. p(X) = p(X | Y)
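
A quick numeric check (not from the slides): the sketch below verifies these identities for two binary variables, using made-up probabilities.

```python
# Made-up probabilities for two binary variables X and Y (illustration only).
p_x = {True: 0.3, False: 0.7}                      # p(X)
p_y_given_x = {True: {True: 0.8, False: 0.2},      # p(Y | X)
               False: {True: 0.1, False: 0.9}}

# Joint probability: p(X, Y) = p(Y | X) p(X)
p_xy = {(x, y): p_y_given_x[x][y] * p_x[x]
        for x in (True, False) for y in (True, False)}

# Marginal: p(Y) = sum over X of p(X, Y)
p_y = {y: sum(p_xy[(x, y)] for x in (True, False)) for y in (True, False)}

# Bayes' Theorem: p(X | Y) = p(X, Y) / p(Y) = p(Y | X) p(X) / p(Y)
p_x_given_y = {(x, y): p_xy[(x, y)] / p_y[y]
               for x in (True, False) for y in (True, False)}

print(p_x_given_y[(True, True)])   # posterior p(X=true | Y=true)
```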

  2. Serial: A → B → C
  p(A,B,C) = p(A) p(B | A) p(C | B)
  Diverging: A → B, A → C, A → D
  p(A,B,C,D) = p(A) p(B | A) p(C | A) p(D | A)
  Converging: B → A, C → A, D → A
  p(A,B,C,D) = p(B) p(C) p(D) p(A | B,C,D)
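
The three factorizations translate directly into code. In the sketch below the conditional probability tables (CPTs) are hypothetical nested dictionaries keyed first by the parent value(s), then by the variable's own value; none of this is from the slides.

```python
# Each CPT is a nested dict: cpt[parent_value][value], or cpt[(b, c, d)][value]
# for the converging case. The tables themselves are assumed to be supplied.

def p_serial(a, b, c, pA, pB_given_A, pC_given_B):
    # Serial connection A -> B -> C
    return pA[a] * pB_given_A[a][b] * pC_given_B[b][c]

def p_diverging(a, b, c, d, pA, pB_given_A, pC_given_A, pD_given_A):
    # Diverging connection: B, C, D share the single parent A
    return pA[a] * pB_given_A[a][b] * pC_given_A[a][c] * pD_given_A[a][d]

def p_converging(a, b, c, d, pB, pC, pD, pA_given_BCD):
    # Converging connection: A has the three parents B, C, D
    return pB[b] * pC[c] * pD[d] * pA_given_BCD[(b, c, d)][a]
```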

  3. Model Assumptions: Pregnant (Pr) → Blood Test (BT), Pregnant (Pr) → Urine Test (UT)
  p(Pr,BT,UT) = p(Pr) p(BT | Pr) p(UT | Pr)
  The model assumes BT and UT are independent once Pr is known:
  p(UT = no | Pr = yes) will not be affected by also learning BT = no when Pr = yes.
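
A minimal sketch of this factorization with assumed, illustrative numbers; it also checks that, given Pr, the urine-test distribution is unaffected by the blood-test result.

```python
# Assumed, illustrative tables (not figures from the slides).
p_pr = {"yes": 0.5, "no": 0.5}
p_bt_given_pr = {"yes": {"pos": 0.9, "neg": 0.1}, "no": {"pos": 0.05, "neg": 0.95}}
p_ut_given_pr = {"yes": {"pos": 0.85, "neg": 0.15}, "no": {"pos": 0.1, "neg": 0.9}}

def joint(pr, bt, ut):
    # p(Pr, BT, UT) = p(Pr) p(BT | Pr) p(UT | Pr)
    return p_pr[pr] * p_bt_given_pr[pr][bt] * p_ut_given_pr[pr][ut]

# Conditional independence check: p(UT | Pr=yes, BT=neg) equals p(UT | Pr=yes).
num = joint("yes", "neg", "pos")
den = sum(joint("yes", "neg", ut) for ut in ("pos", "neg"))
print(num / den, p_ut_given_pr["yes"]["pos"])   # both equal 0.85 (up to floating point)
```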

  4. Introduce Variable: Pregnant (Pr) → Hormone (Ho); Hormone (Ho) → Blood Test (BT), Urine Test (UT)
  p(Pr,Ho,BT,UT) = p(Pr) p(Ho | Pr) p(BT | Ho) p(UT | Ho)
  Introducing the hormone level models the relationship between the two tests.
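
A sketch of the extended factorization; the conditional probability tables are hypothetical arguments rather than anything given on the slide.

```python
# CPTs are assumed nested dicts keyed by parent value, then by the variable's value.
def joint4(pr, ho, bt, ut, p_pr, p_ho_given_pr, p_bt_given_ho, p_ut_given_ho):
    # p(Pr, Ho, BT, UT) = p(Pr) p(Ho | Pr) p(BT | Ho) p(UT | Ho)
    return (p_pr[pr]
            * p_ho_given_pr[pr][ho]
            * p_bt_given_ho[ho][bt]
            * p_ut_given_ho[ho][ut])

# Summing Ho out leaves a model over (Pr, BT, UT) in which the two tests are
# linked through the hormone level instead of being assumed independent given Pr.
def joint3(pr, bt, ut, ho_values, *cpts):
    return sum(joint4(pr, ho, bt, ut, *cpts) for ho in ho_values)
```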

  5. Undirected Variables: B, C, D, E

  6. Constraint Variables: B, C, D, E plus a constraint node observed as Constraint = yes
  An artificial constraint variable whose value is "known" excludes certain combinations of the other variables (sock example).
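
One way to sketch such a constraint node; the particular rule enforced below is a made-up illustration, not the sock example from the lecture.

```python
# A constraint node is a child of the variables it restricts; observing
# Constraint = "yes" gives probability 0 to the combinations it excludes.
def constraint_ok(b, d):
    # Hypothetical rule: exclude combinations where B and D take the same value.
    return b != d

def p_constraint_yes_given(b, d):
    # p(Constraint = yes | B, D): 1 for allowed combinations, 0 for excluded ones.
    return 1.0 if constraint_ok(b, d) else 0.0
```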

  7. Many Variables: B, C, D, E all pointing into A
  Problem: the table for p(A | B, C, D, E) is huge.
  Possible solution: divorcing.

  8. Divorcing: B, C → BC; D, E → DE; BC, DE → A
  If multiple combinations of B and C are similar with respect to influencing A, group them into subsets and make an intermediate BC variable whose states are those subsets (and likewise DE).
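
A sketch of divorcing under assumed groupings: the (B, C) and (D, E) combinations are mapped to intermediate BC and DE states, so A needs a much smaller table. The grouping functions are illustrative, not from the slides.

```python
# Group the parent combinations; here B, C, D, E are booleans and the grouping
# by "how many of the pair are true" is an assumed example.
def bc_group(b, c):
    return "both" if (b and c) else ("one" if (b or c) else "none")

def de_group(d, e):
    return "both" if (d and e) else ("one" if (d or e) else "none")

# A's table is now indexed by 3 x 3 = 9 parent states instead of 2**4 = 16.
def p_a_given_parents(a, b, c, d, e, p_a_given_bc_de):
    return p_a_given_bc_de[(bc_group(b, c), de_group(d, e))][a]
```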

  9. t1 t2 t3 t4 t5, r1 r2 r3 r4 r5
  Two-symbol language (a, b), 5-letter word; t is the real letter sent, r is the transmission received.
  p(t1,t2,t3,t4,t5,r1,r2,r3,r4,r5) = p(t1) p(r1 | t1) p(t2 | t1) ....
  Only takes into account p(ti | ti-1).
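
A sketch of this chain factorization; the transition and emission tables below are assumed numbers for illustration.

```python
# Assumed tables for the two-symbol alphabet {'a', 'b'}.
p_t1 = {'a': 0.5, 'b': 0.5}                                        # p(t1)
p_trans = {'a': {'a': 0.7, 'b': 0.3}, 'b': {'a': 0.3, 'b': 0.7}}   # p(ti | ti-1)
p_emit = {'a': {'a': 0.9, 'b': 0.1}, 'b': {'a': 0.1, 'b': 0.9}}    # p(ri | ti)

def joint(ts, rs):
    # p(t1..t5, r1..r5) = p(t1) p(r1|t1) * product over i of p(ti|ti-1) p(ri|ti)
    p = p_t1[ts[0]] * p_emit[ts[0]][rs[0]]
    for i in range(1, len(ts)):
        p *= p_trans[ts[i - 1]][ts[i]] * p_emit[ts[i]][rs[i]]
    return p

print(joint("ababa", "abbba"))   # probability of sending "ababa" and receiving "abbba"
```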

  10. Word → r1, r2, r3, r4, r5
  p(Word,r1,r2,r3,r4,r5) = p(Word) p(r1 | Word) p(r2 | Word) p(r3 | Word) p(r4 | Word) p(r5 | Word)
  Word ranges over all 32 possible 5-letter words.
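
A sketch of the word-level model, assuming a uniform prior over the 32 words and a simple per-letter noise probability (both assumptions, not from the slides).

```python
from itertools import product

words = [''.join(w) for w in product('ab', repeat=5)]   # all 2**5 = 32 five-letter words
p_word = {w: 1 / len(words) for w in words}             # assumed uniform prior over words

def p_r_given_word(r, i, word, flip=0.1):
    # Received letter ri matches the word's i-th letter with probability 1 - flip.
    return 1 - flip if r == word[i] else flip

def joint(word, rs):
    # p(Word, r1..r5) = p(Word) * product over i of p(ri | Word)
    p = p_word[word]
    for i, r in enumerate(rs):
        p *= p_r_given_word(r, i, word)
    return p

print(joint("ababa", "abbba"))
```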

  11. Hidden Markov (two network diagrams over t1..t5 and r1..r5, one of them marked as violating the hidden Markov assumption)

  12. Kalman Filter: a hidden Markov model where only one variable lies outside the time slice (diagram over t1..t5 and r1..r5).
  Markov Chain: only one variable inside each time slice (t1..t5).
  A hidden Markov model can be transformed into a Markov chain by taking the cross product of all variables within each time slice.
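
A sketch of that cross-product transformation, assuming each time slice holds the two variables ti and ri with made-up tables; the composite (ti, ri) states then form an ordinary Markov chain.

```python
from itertools import product

# Assumed per-variable tables over the alphabet {'a', 'b'}: p(ti | ti-1) and p(ri | ti).
p_trans = {'a': {'a': 0.7, 'b': 0.3}, 'b': {'a': 0.3, 'b': 0.7}}
p_emit = {'a': {'a': 0.9, 'b': 0.1}, 'b': {'a': 0.1, 'b': 0.9}}

# Composite states: the cross product of the variables inside one time slice.
slice_states = list(product('ab', repeat=2))   # (ti, ri) -> 4 composite states

# Transition matrix of the resulting Markov chain over composite states:
# p((ti, ri) | (ti-1, ri-1)) = p(ti | ti-1) p(ri | ti), which does not depend on ri-1.
chain = {prev: {nxt: p_trans[prev[0]][nxt[0]] * p_emit[nxt[0]][nxt[1]]
                for nxt in slice_states}
         for prev in slice_states}

# Every row sums to 1, so the composite process is an ordinary Markov chain.
print(all(abs(sum(row.values()) - 1.0) < 1e-9 for row in chain.values()))
```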
