Intelligent Systems (AI-2), Computer Science CPSC 422, Lecture 17, Feb 14, 2014

Presentation Transcript


  1. Intelligent Systems (AI-2), Computer Science CPSC 422, Lecture 17, Feb 14, 2014. Slide sources: D. Koller, Stanford CS, Probabilistic Graphical Models; D. Page, Whitehead Institute, MIT. Several figures from "Probabilistic Graphical Models: Principles and Techniques", D. Koller and N. Friedman, 2009

  2. Lecture Overview • Probabilistic Graphical Models • Intro • Example • Markov Networks Representation (vs. Belief Networks) • Inference in Markov Networks (Exact and Approx.) • Applications of Markov Networks

  3. Probabilistic Graphical Models. From "Probabilistic Graphical Models: Principles and Techniques", D. Koller and N. Friedman, 2009

  4. Misconception Example • Four students (Alice, Bill, Debbie, Charles) get together in pairs to work on a homework assignment • But only in the following pairs: AB AD DC BC • The professor misspoke and might have generated a misconception • A student might have figured it out later and told their study partner

  5. Example: In/Dependencies • Are A and C independent because they never spoke? • No, because A might have figured it out and told B, who then told C • But what if we know the values of B and D…? • And what if we know the values of A and C?

  6. Which of these two Bnets captures the two independencies of our example? • A. • B. • C. Both • D. None

  7. Parameterization of Markov Networks • Factors define the local interactions (like CPTs in Bnets) • What about the global model? What do you do with Bnets?

  8. How do we combine local models? • As in Bnets, by multiplying them!

  9. Multiplying Factors (same as seen in 322 for VarElim)
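The factor product works exactly like the one used in 322's variable elimination: entries that agree on the shared variables are multiplied pointwise. Below is a minimal Python sketch (not from the original slides), assuming binary variables and factors stored as dicts keyed by tuples of values; the numeric entries are made up for illustration.

    from itertools import product

    def multiply_factors(f1, vars1, f2, vars2):
        """Pointwise product of two factors over binary variables."""
        # Union of scopes: vars1 first, then the new variables from vars2.
        out_vars = vars1 + [v for v in vars2 if v not in vars1]
        result = {}
        for assignment in product([0, 1], repeat=len(out_vars)):
            env = dict(zip(out_vars, assignment))
            v1 = f1[tuple(env[v] for v in vars1)]
            v2 = f2[tuple(env[v] for v in vars2)]
            result[assignment] = v1 * v2  # matching entries multiply pointwise
        return result, out_vars

    # phi1(A,B) * phi2(B,C) -> a factor over (A,B,C); values are illustrative.
    phi1 = {(0, 0): 30.0, (0, 1): 5.0, (1, 0): 1.0, (1, 1): 10.0}
    phi2 = {(0, 0): 100.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 100.0}
    prod, scope = multiply_factors(phi1, ['A', 'B'], phi2, ['B', 'C'])
    print(scope, prod[(0, 0, 0)])  # ['A', 'B', 'C'] 3000.0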

  10. Factors do not represent marginal probabilities! • Marginal P(A,B) • Computed from the joint
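To make the slide's point concrete: the joint distribution is the product of all factors divided by the normalization constant Z, and a marginal such as P(A,B) is obtained by summing that joint, not by reading off the single factor over A and B. A hedged sketch, reusing the dict representation above with placeholder values:

    from itertools import product

    # Hypothetical unnormalized joint over binary (A, B, C, D), e.g. the
    # product of the four pairwise factors in the misconception example
    # (the values here are placeholders, not the slide's numbers).
    joint_unnorm = {x: float(1 + sum(x)) for x in product([0, 1], repeat=4)}

    Z = sum(joint_unnorm.values())                       # partition function
    joint = {x: v / Z for x, v in joint_unnorm.items()}  # normalized joint

    # Marginal P(A, B): sum the joint over C and D.
    marg_AB = {}
    for (a, b, c, d), p in joint.items():
        marg_AB[(a, b)] = marg_AB.get((a, b), 0.0) + p
    print(marg_AB)  # in general NOT proportional to the single factor phi(A,B)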

  11. Step back…. From structure to factors/potentials • In a Bnet the joint is factorized…. • In a Markov Network you have one factor for each maximal clique
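For reference, the two factorizations this slide contrasts can be written out as follows (standard definitions; the clique potentials play the role the CPTs play in a Bnet, but require a partition function Z):

    % Bayesian network: the joint is a product of CPTs, no normalization needed
    P(X_1,\dots,X_n) \;=\; \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr)

    % Markov network: one potential per maximal clique C, with partition function Z
    P(X_1,\dots,X_n) \;=\; \frac{1}{Z} \prod_{C} \phi_C(X_C),
    \qquad
    Z \;=\; \sum_{x_1,\dots,x_n} \prod_{C} \phi_C(x_C)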

  12. Directed vs. Undirected • Independencies • Factorization

  13. General definitions • Two nodes in a Markov network are independent if and only if every path between them is cut off by evidence • e.g., for A and C • So the Markov blanket of a node is…? • e.g., for C
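Both definitions are easy to operationalize. The sketch below (illustrative, not course code) encodes the diamond graph from the misconception example, with edges AB, AD, DC, BC, as an adjacency dict; independence given evidence reduces to checking that every path between the two nodes passes through an evidence node, and the Markov blanket of a node is simply its set of neighbors.

    def separated(graph, x, y, evidence):
        """True iff every path from x to y is cut by the evidence set."""
        visited, stack = set(), [x]
        while stack:
            node = stack.pop()
            if node == y:
                return False  # found an evidence-free path
            if node in visited or node in evidence:
                continue  # evidence nodes block the path
            visited.add(node)
            stack.extend(graph[node])
        return True

    # Diamond network from the misconception example: A-B, A-D, D-C, B-C.
    g = {'A': ['B', 'D'], 'B': ['A', 'C'], 'C': ['B', 'D'], 'D': ['A', 'C']}
    print(separated(g, 'A', 'C', {'B', 'D'}))  # True: A is indep. of C given B, D
    print(separated(g, 'A', 'C', set()))       # False: paths exist via B or D
    # Markov blanket of a node = its neighbors, e.g. for C:
    print(set(g['C']))  # {'B', 'D'}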

  14. Lecture Overview • Probabilistic Graphical Models • Intro • Example • Markov Networks Representation (vs. Belief Networks) • Inference in Markov Networks (Exact and Approx.) • Applications of Markov Networks

  15. Variable elimination algorithm for Bnets • To compute P(Z | Y1=v1, …, Yj=vj): • Construct a factor for each conditional probability • Set the observed variables to their observed values • Given an elimination ordering, simplify/decompose the sum of products • Perform products and sum out each Zi • Multiply the remaining factors • Normalize: divide the resulting factor f(Z) by ∑Z f(Z) • Variable elimination algorithm for Markov Networks…
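A compact sketch of the Markov-network version, assuming binary variables and the dict-based factors from the earlier sketches (illustrative code, not the course's implementation). The algorithm is essentially unchanged from the Bnet case; the final normalization now does double duty, conditioning on the evidence and absorbing the partition function Z.

    from itertools import product

    def multiply(factors):
        """Pointwise product of a list of (table, vars) factors."""
        all_vars = []
        for _, vs in factors:
            all_vars += [v for v in vs if v not in all_vars]
        table = {}
        for assignment in product([0, 1], repeat=len(all_vars)):
            env = dict(zip(all_vars, assignment))
            val = 1.0
            for t, vs in factors:
                val *= t[tuple(env[v] for v in vs)]
            table[assignment] = val
        return table, all_vars

    def sum_out(factor, var):
        """Sum a variable out of a (table, vars) factor."""
        table, vs = factor
        i = vs.index(var)
        new_vs = vs[:i] + vs[i + 1:]
        new_table = {}
        for assignment, val in table.items():
            key = assignment[:i] + assignment[i + 1:]
            new_table[key] = new_table.get(key, 0.0) + val
        return new_table, new_vs

    def variable_elimination(factors, elim_order):
        """Sum out the variables in elim_order; evidence is assumed to have
        been set beforehand by reducing the factors. The final normalization
        conditions on the evidence and absorbs the partition function Z."""
        for var in elim_order:
            related = [f for f in factors if var in f[1]]
            rest = [f for f in factors if var not in f[1]]
            factors = rest + [sum_out(multiply(related), var)]
        table, scope = multiply(factors)
        Z = sum(table.values())
        return {k: v / Z for k, v in table.items()}, scope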

  16. Gibbs sampling for Markov Networks • Example: P(D | ¬c) • Resample non-evidence variables in a pre-defined order or a random order • Suppose we begin with A • Note: never change evidence ¬c (i.e., c=0)! • What do we need to sample? • A. P(A | B) • B. P(A | B C) • C. P(B C | A)

  17. Example: Gibbs sampling • Resample A from the distribution P(A | B C) • Normalized result =

  18. Example: Gibbs sampling • Resample B from its distribution given A and D • Normalized result =
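Putting slides 16-18 together, here is a hedged sketch of the full sampler, again assuming binary variables and dict-based factors as above. The key observation is that each variable's full conditional is proportional to the product of only the factors that touch it, i.e., its Markov blanket, and that evidence variables are never resampled.

    import random

    def gibbs(factors, variables, evidence, n_samples, burn_in=100):
        """Estimate P(V=1) for each variable V by Gibbs sampling.

        factors: list of (table, vars) with dict tables as in earlier sketches
        (assumed strictly positive). evidence: dict of clamped values, e.g.
        {'C': 0} for a query like P(D | c=0).
        """
        state = dict(evidence)
        for v in variables:
            state.setdefault(v, random.randint(0, 1))  # random initial state
        counts = {v: 0 for v in variables}
        for t in range(burn_in + n_samples):
            for v in variables:
                if v in evidence:
                    continue  # never change the evidence
                weights = []
                for val in (0, 1):
                    state[v] = val
                    w = 1.0
                    for table, scope in factors:
                        if v in scope:  # only factors touching v matter
                            w *= table[tuple(state[u] for u in scope)]
                    weights.append(w)
                p1 = weights[1] / (weights[0] + weights[1])
                state[v] = 1 if random.random() < p1 else 0
            if t >= burn_in:
                for v in variables:
                    counts[v] += state[v]
        return {v: counts[v] / n_samples for v in variables}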

  19. Lecture Overview • Probabilistic Graphical Models • Intro • Example • Markov Networks Representation (vs. Belief Networks) • Inference in Markov Networks (Exact and Approx.) • Applications of Markov Networks

  20. Markov Networks Applications (1): Computer Vision • Called Markov Random Fields (MRFs) • Stereo reconstruction • Image segmentation • Object recognition • Typically pairwise MRFs • Each variable corresponds to a pixel (or superpixel) • Edges (factors) correspond to interactions between adjacent pixels in the image • E.g., in segmentation: from generically penalizing discontinuities to capturing specific context such as "road under car"
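The pairwise MRFs mentioned here are usually written in energy form, with one unary term per pixel and one pairwise term per edge of the pixel grid; this is the standard formulation, not reproduced from the slide figures:

    % Pairwise MRF over pixel labels x_i: unary data terms plus pairwise
    % smoothness/context terms on adjacent pixels (i, j)
    E(\mathbf{x}) \;=\; \sum_{i} \theta_i(x_i) \;+\; \sum_{(i,j)\in\mathcal{E}} \theta_{ij}(x_i, x_j),
    \qquad
    P(\mathbf{x}) \;=\; \frac{1}{Z}\exp\bigl(-E(\mathbf{x})\bigr)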

  21. Image segmentation

  22. Markov Networks Applications (2): Sequence Labeling in NLP and Bioinformatics • Conditional Random Fields (next class, Wed in two weeks)

  23. Learning Goals for today's class • You can: • Justify the need for undirected graphical models (Markov Networks) • Interpret local models (factors/potentials) and combine them to express the joint • Define independencies and Markov blankets for Markov Networks • Perform exact and approximate inference in Markov Networks • Describe a few applications of Markov Networks

  24. Next class: Midterm, Mon, Feb 24; we will start at 10am sharp • How to prepare… • Keep working on assignment-2! • Office hours next week (Wed, Fri 9-10am) • Learning goals (look at the end of the slides for each lecture) • Revise all the clicker questions and practice exercises • Will post more practice material next week

  25. How to acquire factors?

