
Graphical Models In Python | Edureka

(** Graphical Models Certification Training: https://www.edureka.co/graphical-modelling-course **)
This Edureka "Graphical Models" PPT answers the question "Why do we need Probabilistic Graphical Models?" and how they compare to Neural Networks. It takes you through the basics of PGMs and gives real-world examples of their applications.

Why do you need PGMs?
What is a PGM?
Bayesian Networks
Markov Random Fields
Use Cases
Bayesian Networks & Markov Random Fields
PGMs & Neural Networks

Follow us to never miss an update in the future.
Instagram: https://www.instagram.com/edureka_learning/
Facebook: https://www.facebook.com/edurekaIN/
Twitter: https://twitter.com/edurekain
LinkedIn: https://www.linkedin.com/company/edureka

EdurekaIN




Presentation Transcript


  1. Agenda 01 Why do you need PGMs? 02 What is a PGM? 03 Bayesian Networks 04 Markov Random Fields 05 Use Cases 06 Bayesian Networks & Markov Random Fields 07 PGMs & Neural Networks

  2. Why do you need PGMs?

  3. Why do you need Probabilistic Graphical Models? Probabilistic Graphical Models are rich frameworks for encoding probability distributions over complex domains. 01 Compact Graphical Representation: PGMs are used to create and represent compact graphical models of complex real-world scenarios. 02 Intuitive Diagrams of Complex Relationships: PGMs give us intuitive diagrams of complex relationships between stochastic variables. 03 Convenient from a Computational Aspect: PGMs are also convenient from a computational point of view, since we already have algorithms for working with graphs and statistics. 04 Dynamic Simulation of Models: Using PGMs we can simulate the dynamics of industrial processes, create models, and much more.

  4. What is a PGM?

  5. What is a Probabilistic Graphical Model? Consider you have 4 binary (Yes/No) variables: Good Genetics (G), Good Form (F), Performance in the Pre-WC Tour (P), and a Spot in the World Cup team (WC).

  6. What is a Probabilistic Graphical Model? Consider you have 4 binary (Yes/No) variables: G, F, P and WC.

  7. What is a Probabilistic Graphical Model? Components of a Graphical Model: Nodes = Random Variables (G, F, P, WC).

  8. What is a Probabilistic Graphical Model? Components of a Graphical Model: Edges = Inter-Nodal Dependencies.

  9. So, What is a PGM? P — Probabilistic, G — Graphical, M — Models.

  10. So, What is a PGM? Probabilistic (P): The nature of the problems we generally want to solve, and the type of queries we want to make, are probabilistic because of uncertainty. There are many reasons that contribute to this uncertainty.


  12. So, What is a PGM? Graphical (G): A graphical representation helps us visualise better, so we use graph theory to reduce the number of relevant combinations of the participating variables and represent the high-dimensional probability distribution more compactly.


  14. So, What is a PGM? Models (M): A model is a declarative representation of a real-world scenario or problem that we want to analyse. It can be represented using mathematical tools such as a graph, or even simply an equation.


  16. So, What is a PGM? Probabilistic Graphical Models (PGMs) are a technique for compactly representing a joint distribution by exploiting dependencies between the random variables. They also let us do inference on joint distributions in a computationally cheaper way than traditional methods.

  17. Probability: What is the probability of A? Given a bag of outcomes labelled A, B and C. Solution: • Count all the As and divide by the total number of possibilities. • P(A) = #A / (#A + #B + #C)
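The counting rule above can be sketched directly in Python (the outcome list below is made up for illustration, not the exact counts on the slide):

```python
# Estimating P(A) by counting labelled outcomes, as on the slide.
# The outcome list is illustrative.
outcomes = ["A", "B", "A", "C", "B", "A", "C", "B", "B", "A"]

# P(A) = #A / (#A + #B + #C)
p_a = outcomes.count("A") / len(outcomes)
print(p_a)  # 4 As out of 10 outcomes -> 0.4
```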

  18. Conditional Probability: What is the probability of A and B together? Solution: • B should occur when A is already happening. • P(A,B) = P(A) · P(B|A) = P(B) · P(A|B)

  19. Joint, Conditional and Marginal Distributions • The Joint Probability Distribution describes how two or more variables are distributed simultaneously. A probability from the joint distribution of A and B is written P(A=a, B=b). • The Conditional Probability Distribution describes how the probabilities of A are distributed given a particular value of B: P(A=a | B=b). • The Marginal Probability Distribution results from summing (or integrating) the joint distribution over one variable to get the distribution of the other. For example, the marginal distribution of A when A and B are related is P(a) = ∫ P(a|b) P(b) db.
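The three distributions above can be illustrated on a tiny discrete joint table (the numbers are made up for the sketch):

```python
# Joint, conditional and marginal distributions on a 2x2 example.
# The joint table P(A, B) below is illustrative.
joint = {
    ("a1", "b1"): 0.3, ("a1", "b2"): 0.2,
    ("a2", "b1"): 0.1, ("a2", "b2"): 0.4,
}

# Marginal: sum the joint over the other variable.
p_a1 = sum(p for (a, b), p in joint.items() if a == "a1")  # P(A=a1)
p_b1 = sum(p for (a, b), p in joint.items() if b == "b1")  # P(B=b1)

# Conditional: joint divided by the marginal of the conditioning variable.
p_a1_given_b1 = joint[("a1", "b1")] / p_b1  # P(A=a1 | B=b1)

print(p_a1, p_b1, p_a1_given_b1)  # 0.5 0.4 0.75
```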

  20. Bayesian Networks

  21. Bayesian Probability: P(One Event | Another Event). We have seen earlier: • P(A,B) = P(A) · P(B|A) = P(B) · P(A|B). From here we can isolate either P(B|A) or P(A|B) and compute it from simpler probabilities.

  22. Bayes' Theorem. Rearranging the identity from the previous slide gives: • P(A|B) = P(B|A) · P(A) / P(B) • P(B|A) = P(A|B) · P(B) / P(A)
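Bayes' theorem can be checked numerically. A minimal sketch, with illustrative numbers, that recovers P(A|B) from P(B|A), P(A) and P(B):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# All probabilities below are illustrative.
p_a = 0.3
p_b_given_a = 0.8
p_b_given_not_a = 0.2

# Total probability: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.24 / 0.38 ~ 0.6316
```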

  23. Bayes Network: A Bayes network is a structure that can be represented as a Directed Acyclic Graph. 1. It allows a compact representation of the distribution via the chain rule of Bayes networks. 2. It encodes conditional independence relationships between random variables. A DAG (Directed Acyclic Graph) is a finite directed graph with no directed cycles. (Nodes here: G, F, P, WC.)

  24. Bayes' Theorem: Example. The CPDs as transcribed from the slide:
P(Genes): Good 0.2, Bad 0.8
P(Form): Yes 0.7, No 0.3
P(Performance | Genes, Form): Good Genes & Good Form → Bad 0.5, Okay 0.3, Brilliant 0.2; Good Genes & Bad Form → Bad 0.8, Okay 0.15, Brilliant 0.05; Bad Genes & Good Form → Bad 0.8, Okay 0.1, Brilliant 0.1; Bad Genes & Bad Form → Bad 0.9, Okay 0.08, Brilliant 0.02
P(WC Spot | Performance): Bad Performance → No Spot 0.95, Spot 0.05; Okay Performance → No Spot 0.8, Spot 0.2; Brilliant Performance → No Spot 0.5, Spot 0.5
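A network this small can be queried by brute-force enumeration over the chain rule. A Python sketch using the CPD values from the slide (the assignment of values to condition rows is inferred from their order on the slide, so treat the numbers as illustrative):

```python
# Brute-force inference in the G -> P <- F, P -> WC network:
# marginal P(Spot) by summing the chain-rule product over all parents.
# CPD values transcribed from the slide; row assignment is inferred.
from itertools import product

p_genes = {"good": 0.2, "bad": 0.8}
p_form = {"yes": 0.7, "no": 0.3}

# P(Performance | Genes, Form)
p_perf = {
    ("good", "yes"): {"bad": 0.5, "okay": 0.3, "brilliant": 0.2},
    ("good", "no"):  {"bad": 0.8, "okay": 0.15, "brilliant": 0.05},
    ("bad", "yes"):  {"bad": 0.8, "okay": 0.1, "brilliant": 0.1},
    ("bad", "no"):   {"bad": 0.9, "okay": 0.08, "brilliant": 0.02},
}

# P(WC spot | Performance)
p_spot = {"bad": 0.05, "okay": 0.2, "brilliant": 0.5}

total = 0.0
for g, f in product(p_genes, p_form):
    for perf, pp in p_perf[(g, f)].items():
        total += p_genes[g] * p_form[f] * pp * p_spot[perf]
print(round(total, 4))  # marginal probability of a WC spot
```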

  25. Bayes' Theorem: Example. What should you think about? • Does a spot in the WC team depend on genetics? • Does it still depend on genetics if you know someone is in good form? • Does it depend on genetics if you know the performance in the Pre-WC Tour?

  26. Bayes' Theorem: Example. How this works: • Each node in the Bayes network has a CPD (conditional probability distribution) associated with it. • If the node has parents, the associated CPD represents P(value | parents' values). • If a node has no parents, the CPD represents P(value), the unconditional probability of the value.

  27. Markov Random Fields

  28. Undirected Graphical Models. Variables A, B, C, D, E. • P(A,B,C,D,E) ∝ ϕ(A,B) ϕ(B,C) ϕ(B,D) ϕ(C,E) ϕ(D,E). The probability distribution of the variables in the graph can be factorised into individual clique potential functions.

  29. Cliques. • P(A,B,C,D,E) ∝ ϕ(A,B) ϕ(B,C) ϕ(B,D) ϕ(C,E) ϕ(D,E). In general, P(X) = (1/Z) ∏_{c ∈ cliques(G)} ϕ_c(X_c), where the ϕ_c are the potential functions and Z is the normalising constant.





  34. Cliques. Using maximal cliques: • P(A,B,C,D,E) ∝ ϕ(A,B) ϕ(B,C,D) ϕ(C,D,E), again with P(X) = (1/Z) ∏_{c ∈ cliques(G)} ϕ_c(X_c).
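The clique factorisation can be made concrete on a tiny model: the joint is the product of potentials divided by the partition function Z. A minimal sketch with one two-node clique (the potential values are illustrative):

```python
# MRF factorisation: P(X) = (1/Z) * product of clique potentials.
# One clique {A, B} over binary variables; potential values are illustrative
# and need not sum to 1 -- that is what Z is for.
from itertools import product

phi = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 5.0}

# Partition function: sum of the (unnormalised) product over all states.
Z = sum(phi[(a, b)] for a, b in product([0, 1], repeat=2))

p = {ab: v / Z for ab, v in phi.items()}  # normalised joint P(A, B)
print(Z, p[(1, 1)])  # Z = 10.0, P(A=1, B=1) = 0.5
```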

  35. Markov Random Fields. • Paths between A and C: A–B–C and A–B–D–E–C.


  37. Markov Random Fields. • Any two subsets of variables are conditionally independent, given a separating subset. • Here {B,D}, {B,E} and {B,D,E} are separating subsets for A and C.

  38. Use Cases

  39. Applications of PGMs. Google search is built on a very simple graph algorithm called PageRank. Netflix, Amazon and Facebook all use PGMs to recommend what is best for you.
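The PageRank idea mentioned above can be sketched in a few lines of Python with power iteration (the three-page link graph and damping factor below are illustrative, not Google's actual data):

```python
# Minimal PageRank power iteration on a toy 3-page link graph.
# Graph and damping factor are illustrative.
damping = 0.85
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # page -> pages it links to
rank = {p: 1 / len(links) for p in links}          # start uniform

for _ in range(100):
    # Teleportation term, then redistribute each page's rank to its targets.
    new = {p: (1 - damping) / len(links) for p in links}
    for page, outgoing in links.items():
        share = rank[page] / len(outgoing)
        for target in outgoing:
            new[target] += damping * share
    rank = new

print({p: round(r, 3) for p, r in sorted(rank.items())})
```

Page "c" ends up ranked above "b" because it receives links from both "a" and "b", which is exactly the intuition behind the algorithm.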

  40. Applications of PGMs. FiveThirtyEight is a company that makes predictions about American presidential polls using PGMs. PGMs can also apparently infer whether someone is a Modi supporter or a Kejriwal supporter.

  41. Bayesian Networks & Markov Random Fields

  42. Bayes Nets as MRFs. Bayes network: A → B, P(A,B) = P(A) · P(B|A). MRF: A — B, P(A,B) ∝ ϕ(A,B).

  43. Bayes Nets as MRFs. Bayes network: A → B → C, P(A,B,C) = P(A) P(B|A) P(C|B). MRF: A — B — C, P(A,B,C) ∝ ϕ(A,B) ϕ(B,C).

  44. Bayes Nets as MRFs: Chains. Bayes network: A → B → C, P(A,B,C) = P(A) P(B|A) P(C|B). MRF: P(A,B,C) ∝ ϕ(A,B) ϕ(B,C), with ϕ(A,B) ← P(A) P(B|A) and ϕ(B,C) ← P(C|B). The parameterisation is not unique.
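The chain conversion can be verified by enumeration: with ϕ(A,B) = P(A)P(B|A) and ϕ(B,C) = P(C|B), the MRF product equals the Bayes-net joint for every assignment. A sketch with illustrative binary CPDs:

```python
# Verify the chain A -> B -> C converts to an MRF with the same joint.
# CPD numbers are illustrative; variables are binary (0/1).
from itertools import product

p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # key (c, b)

# The slide's potentials: phi(A,B) <- P(A)P(B|A), phi(B,C) <- P(C|B).
phi_ab = {(a, b): p_a[a] * p_b_given_a[(b, a)] for a, b in product([0, 1], repeat=2)}
phi_bc = {(b, c): p_c_given_b[(c, b)] for b, c in product([0, 1], repeat=2)}

for a, b, c in product([0, 1], repeat=3):
    bayes = p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]
    mrf = phi_ab[(a, b)] * phi_bc[(b, c)]
    assert abs(bayes - mrf) < 1e-12
print("joints match")
```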

  45. Bayes Nets as MRFs: Shared Parents. Bayes network: B ← A → C, P(A,B,C) = P(A) P(B|A) P(C|A). MRF: P(A,B,C) ∝ ϕ(A,B) ϕ(A,C), with ϕ(A,B) ← P(A) P(B|A) and ϕ(A,C) ← P(C|A).

  46. Bayes Nets as MRFs: Shared Child. Bayes network: A → C ← B, P(A,B,C) = P(A) P(B) P(C|A,B); A and B are dependent given C. MRF: P(A,B,C) ∝ ϕ(A,C) ϕ(B,C); A and B are independent given C.

  47. Converting Bayes Nets to MRFs: Moralizing Parents. P(A,B,C) ∝ ϕ(A,C) ϕ(B,C); A and B are independent given C. • Moralize all co-parents (connect them with an undirected edge). • Going from directed to undirected, we lose the marginal independence of the parents.
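The moralization step can be sketched as a small graph transformation: marry all co-parents of each node, then drop edge directions. A minimal sketch on the slide's A → C ← B example:

```python
# Moralisation: connect ("marry") all co-parents of each node,
# then drop edge directions. Edge list is the slide's A -> C <- B example.
directed = [("A", "C"), ("B", "C")]  # (parent, child) pairs

undirected = {frozenset(e) for e in directed}  # drop directions

# Collect each node's parents.
children = {}
for parent, child in directed:
    children.setdefault(child, []).append(parent)

# Marry every pair of co-parents.
for child, parents in children.items():
    for i in range(len(parents)):
        for j in range(i + 1, len(parents)):
            undirected.add(frozenset((parents[i], parents[j])))

print(sorted(tuple(sorted(e)) for e in undirected))
# the added A-B edge is what loses the parents' marginal independence
```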

  48. PGMs and Neural Networks

  49. The Boyfriend Problem PGMs & Neural Networks
