Belief Propagation on Markov Random Fields


Presentation Transcript


  1. Belief Propagation on Markov Random Fields Aggeliki Tsoli

  2. Outline • Graphical Models • Markov Random Fields (MRFs) • Belief Propagation

  3. Graphical Models • Diagrams • Nodes: random variables • Edges: statistical dependencies among random variables • Advantages: • better visualization • conditional independence properties • design of new models • factorization

  4. Types of Graphical Models • Directed • causal relationships • e.g. Bayesian networks • Undirected • no constraints imposed on the causality of events (“weak dependencies”) • e.g. Markov Random Fields (MRFs)

  5. Example MRF Application: Image Denoising • [Figure: noisy binary image (e.g. 10% noise) next to the original binary image] • Question: How can we retrieve the original image given the noisy one?

  6. MRF formulation • Nodes • For each pixel i: • xi : latent variable (value in original image) • yi : observed variable (value in noisy image) • xi, yi ∈ {0, 1} • [Figure: pairwise MRF with latent nodes x1 … xn, each connected to its observed node y1 … yn]

  7. MRF formulation • Edges • xi, yi of each pixel i are correlated • local evidence function φ(xi, yi) • E.g. φ(xi, yi) = 0.9 if xi = yi and φ(xi, yi) = 0.1 otherwise (10% noise) • Neighboring pixels tend to have similar values • compatibility function ψ(xi, xj)
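
To make the two potentials concrete, here is a minimal Python/NumPy sketch: the φ table uses the 0.9 / 0.1 values from this slide, while the ψ values are an assumed smoothness preference, since the slides never fix them.

    import numpy as np

    # Local evidence phi[x_i, y_i]: 0.9 when the latent pixel agrees with its
    # observation, 0.1 otherwise (the 10%-noise values given on this slide).
    phi = np.array([[0.9, 0.1],
                    [0.1, 0.9]])

    # Compatibility psi[x_i, x_j] for neighboring pixels: favors equal values.
    # The 0.8 / 0.2 numbers are an assumption, for illustration only.
    psi = np.array([[0.8, 0.2],
                    [0.2, 0.8]])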

  8. MRF formulation • Question: What are the marginal distributions for xi, i = 1, …, n? • P(x1, x2, …, xn) = (1/Z) ∏(i,j) ψ(xi, xj) ∏i φ(xi, yi)
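
For intuition (and as a later check on belief propagation), these marginals can be computed by brute force on a tiny image, summing the factorized joint over every configuration. A sketch, reusing the phi/psi tables above; y holds the observed pixels and neighbors[i] is a hypothetical adjacency list:

    import numpy as np
    from itertools import product

    def brute_force_marginals(phi, psi, y, neighbors):
        """Marginals of P(x1..xn) = (1/Z) prod_(i,j) psi(xi, xj) prod_i phi(xi, yi),
        obtained by summing over all 2^n configurations (tiny n only)."""
        n = len(y)
        edges = [(i, j) for i in range(n) for j in neighbors[i] if i < j]  # each edge once
        marg = np.zeros((n, 2))
        for x in product((0, 1), repeat=n):
            p = np.prod([phi[x[i], y[i]] for i in range(n)])
            p *= np.prod([psi[x[i], x[j]] for (i, j) in edges])
            for i in range(n):
                marg[i, x[i]] += p
        return marg / marg.sum(axis=1, keepdims=True)  # dividing out Z row by row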

  9. Belief Propagation • Goal: compute the marginals of the latent nodes of the underlying graphical model • Attributes: • iterative algorithm • message passing between neighboring latent-variable nodes • Question: Can it also be applied to directed graphs? • Answer: Yes, but here we will apply it to MRFs

  10. Belief Propagation Algorithm • 1. Select two neighboring latent nodes xi, xj at random • 2. Send message mij from xi to xj • 3. Update the belief about the marginal distribution at node xj • 4. Go to step 1, until convergence • How is convergence defined? • [Figure: latent nodes xi, xj with observations yi, yj and message mij] (a runnable sketch of the whole loop follows slide 13 below)

  11. Step 2: Message Passing • Message mij from xi to xj : what node xi thinks about the marginal distribution of xj • mij(xj) = Σxi φ(xi, yi) ψ(xi, xj) ∏k∈N(i)\j mki(xi) • Messages initially uniformly distributed
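
A direct transcription of this update as Python, in the same sketch style: messages is assumed to be a dict keyed by directed edge (k, i) holding length-2 arrays, neighbors[i] lists the latent neighbors of pixel i, and phi/psi are the tables from the earlier sketch. Normalizing the outgoing message is a common convention rather than part of the formula.

    import numpy as np

    def send_message(i, j, phi, psi, y, messages, neighbors):
        """m_ij(x_j) = sum over x_i of phi(x_i, y_i) psi(x_i, x_j)
        times prod over k in N(i), k != j, of m_ki(x_i)."""
        prod_in = np.ones(2)                    # product of messages into i, excluding j
        for k in neighbors[i]:
            if k != j:
                prod_in *= messages[(k, i)]
        m = np.zeros(2)
        for xj in (0, 1):
            for xi in (0, 1):
                m[xj] += phi[xi, y[i]] * psi[xi, xj] * prod_in[xi]
        return m / m.sum()                      # normalized to keep numbers well-scaled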

  12. Step 3: Belief Update • Belief b(xj): what node xj thinks its marginal distribution is • b(xj) = k φ(xj, yj) ∏q∈N(j) mqj(xj)
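
The belief computation, continuing the same sketch (same assumed data structures as above):

    import numpy as np

    def belief(j, phi, y, messages, neighbors):
        """b(x_j) = k phi(x_j, y_j) prod over q in N(j) of m_qj(x_j)."""
        b = phi[:, y[j]].copy()                 # local evidence at node j
        for q in neighbors[j]:
            b *= messages[(q, j)]               # multiply in every incoming message
        return b / b.sum()                      # k is just the normalization constant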

  13. Belief Propagation Algorithm • 1. Select two neighboring latent nodes xi, xj at random • 2. Send message mij from xi to xj • 3. Update the belief about the marginal distribution at node xj • 4. Go to step 1, until convergence
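
Putting the steps together for the denoising example: a sketch on a small chain of pixels that reuses the phi, psi, send_message, and belief pieces above. The parallel (synchronous) schedule and the convergence threshold are assumptions, since the slides fix neither; on a chain, which is a tree, the resulting beliefs are the exact marginals.

    import numpy as np

    y = np.array([1, 1, 0, 1, 1])                               # noisy observed pixels
    n = len(y)
    neighbors = {i: [k for k in (i - 1, i + 1) if 0 <= k < n] for i in range(n)}
    edges = [(i, k) for i in range(n) for k in neighbors[i]]    # directed edges
    messages = {e: np.full(2, 0.5) for e in edges}              # uniform initialization

    for sweep in range(50):
        new = {(i, j): send_message(i, j, phi, psi, y, messages, neighbors)
               for (i, j) in edges}
        delta = max(np.abs(new[e] - messages[e]).max() for e in edges)
        messages = new
        if delta < 1e-6:                                        # one possible convergence test
            break

    marginals = [belief(j, phi, y, messages, neighbors) for j in range(n)]
    denoised = [int(np.argmax(b)) for b in marginals]           # most probable pixel values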

  14. Example: Compute the belief at node 1. • [Figure: four-node tree from Fig. 12 (Yedidia et al.); nodes 3 and 4 send messages m32 and m42 to node 2, which sends m21 to node 1]
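
Unrolling the update rules from slides 11 and 12 on this tree (keeping the local-evidence terms φ used throughout this talk; nodes 3 and 4 are leaves, so the product over their remaining neighbors is empty):

    m32(x2) = Σx3 φ(x3, y3) ψ(x3, x2)
    m42(x2) = Σx4 φ(x4, y4) ψ(x4, x2)
    m21(x1) = Σx2 φ(x2, y2) ψ(x2, x1) m32(x2) m42(x2)
    b(x1) = k φ(x1, y1) m21(x1)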

  15. Does graph topology matter? • The BP procedure stays the same! • Performance: • failure to converge or to predict accurate beliefs [Murphy, Weiss, Jordan 1999] • success at • decoding for error-correcting codes [Frey and Mackay 1998] • computer vision problems where the underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]

  16. How long does it take? • No explicit reference in the paper • In my opinion, it depends on • the number of nodes in the graph • the graph topology • There is work on improving the running time of BP (for specific applications) • Next time?

  17. Questions?
