
Extending Expectation Propagation for Graphical Models

Presentation Transcript


  1. Extending Expectation Propagation for Graphical Models Yuan (Alan) Qi Joint work with Tom Minka

  2. Motivation • Graphical models are widely used in real-world applications, such as wireless communications and bioinformatics. • Inference techniques on graphical models often sacrifice efficiency for accuracy or sacrifice accuracy for efficiency. • Need a method that better balances the trade-off between accuracy and efficiency.

  3. What we want [Figure: schematic of error versus computational time, showing where current techniques lie and the lower-error, lower-cost region we want to reach]

  4. Outline • Background on expectation propagation (EP) • Extending EP on Bayesian networks for dynamic systems • Poisson tracking • Signal detection for wireless communications • Tree-structured EP on loopy graphs • Conclusions and future work

  5. Outline • Background on expectation propagation (EP) • Extending EP on Bayesian networks for dynamic systems • Poisson tracking • Signal detection for wireless communications • Tree-structured EP on loopy graphs • Conclusions and future work

  6. Graphical Models [Figure: example directed and undirected graphs over variables x1, x2 and y1, y2]

  7. Inference on Graphical Models • Bayesian inference techniques: • Belief propagation (BP): Kalman filtering /smoothing, forward-backward algorithm • Monte Carlo: Particle filter/smoothers, MCMC • Loopy BP: typically efficient, but not accurate on general loopy graphs • Monte Carlo: accurate, but often not efficient

  8. Expectation Propagation in a Nutshell • Approximate a probability distribution by a product of simpler parametric terms, one per factor: conditional probabilities for directed graphs, clique potentials for undirected graphs (written out below) • Each approximation term lives in an exponential family (e.g. Gaussian)
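
The slide's formulas did not survive the transcript; a standard way to write the factorization EP uses, with the factor and term notation assumed here, is:

    Directed graphs:    p(x) = \prod_i p(x_i \mid \mathrm{pa}(x_i))  \approx  q(x) = \prod_i \tilde{f}_i(x)
    Undirected graphs:  p(x) \propto \prod_a \psi_a(x_a)  \approx  q(x) \propto \prod_a \tilde{f}_a(x)

with each approximate term \tilde{f} restricted to an exponential family such as the Gaussian.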

  9. EP in a Nutshell • Each approximate term is refined by minimizing a KL divergence between the tilted distribution and the approximation, which for exponential families amounts to moment matching; the leave-one-out (cavity) approximation is the current q(x) with that term divided out (both are written out below).
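
In the usual EP notation (assumed here, since the slide's equations were images):

    q^{\setminus i}(x) = q(x) / \tilde{f}_i(x)
    \tilde{f}_i^{new}(x) = \arg\min_{\tilde{f}_i} \, \mathrm{KL}\big( f_i(x)\, q^{\setminus i}(x) \;\|\; \tilde{f}_i(x)\, q^{\setminus i}(x) \big)

For exponential-family q this minimization is solved by matching the moments (expected sufficient statistics) of the two distributions.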

  10. Limitations of Plain EP • Can be difficult or expensive to analytically compute the needed moments in order to minimize the desired KL divergence. • Can be expensive to compute and maintain a valid approximation distribution q(x), which is coherent under marginalization. • Example: a tree-structured q(x) (a standard form is given below).
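
A tree-structured q(x) of the kind the slide refers to can be written, under the usual junction-tree convention (this particular form is an assumption, not copied from the slide):

    q(x) = \prod_{(i,j) \in T} q(x_i, x_j) \Big/ \prod_i q(x_i)^{d_i - 1}

where T is the edge set of the chosen tree and d_i is the degree of node i in T; coherence means the pairwise marginals agree with the single-node marginals they share.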

  11. Three Extensions 1. Instead of choosing the approximate term to minimize the KL divergence above, use other criteria. 2. Use numerical approximation to compute moments: quadrature or Monte Carlo (a small sketch follows below). 3. Allow the tree-structured q(x) to be non-coherent during the iterations. It only needs to be coherent in the end.
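
To illustrate extension 2, here is a minimal Python sketch of computing the moments needed for an EP update by Gauss-Hermite quadrature. The helper name tilted_moments, the Poisson example factor, and the numeric values are assumptions for illustration, not code from the talk:

    import numpy as np
    from scipy.special import gammaln

    def tilted_moments(log_f, m, v, n_points=30):
        # Moments of the tilted distribution p_hat(x) ~ f(x) * N(x; m, v),
        # computed with Gauss-Hermite quadrature. The change of variables
        # x = m + sqrt(2*v)*t maps N(m, v) onto the Hermite weight exp(-t^2).
        t, w = np.polynomial.hermite.hermgauss(n_points)
        x = m + np.sqrt(2.0 * v) * t
        fw = w * np.exp(log_f(x)) / np.sqrt(np.pi)
        Z = fw.sum()                                 # normalizing constant
        mean = (fw * x).sum() / Z                    # first moment
        var = (fw * x ** 2).sum() / Z - mean ** 2    # central second moment
        return Z, mean, var

    # Illustrative non-Gaussian factor: a Poisson likelihood with observation
    # y = 3 and log-rate equal to the latent state x (assumed for this sketch).
    y = 3
    log_f = lambda x: y * x - np.exp(x) - gammaln(y + 1)
    Z, mean, var = tilted_moments(log_f, m=0.0, v=1.0)

The returned moments feed the usual moment-matching step; replacing the quadrature nodes with samples from N(m, v) gives the Monte Carlo variant mentioned on the slide.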

  12. Efficiency vs. Accuracy [Figure: error versus computational time, with loopy BP (factorized EP) at low cost but high error, Monte Carlo at low error but high cost, and extended EP targeted in between (marked with a question mark)]

  13. Outline • Background on expectation propagation (EP) • Extending EP on Bayesian networks for dynamic systems • Poisson tracking • Signal detection for wireless communications • Tree-structured EP on loopy graphs • Conclusions and future work

  14. Object Tracking Guess the position of an object given noisy observations [Figure: object and its noisy observations]

  15. Bayesian Network [Figure: state-space chain x1, x2, …, xT with observations y1, y2, …, yT], e.g. a random walk; we want the distribution of the x's given the y's
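
One common instance of the random-walk dynamics the slide names, written out for reference (the noise variance sigma^2 is a placeholder, not a value from the talk):

    x_t = x_{t-1} + \nu_t, \quad \nu_t \sim \mathcal{N}(0, \sigma^2), \qquad y_t \sim p(y_t \mid x_t)

and the goal is the smoothing posterior p(x_{1:T} | y_{1:T}).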

  16. Approximation Factorized and Gaussian in x
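
A minimal way to write the factorized Gaussian approximation named on the slide, with m_t and v_t denoting the per-step mean and variance maintained by EP:

    q(x_{1:T}) = \prod_{t=1}^{T} \mathcal{N}(x_t;\, m_t, v_t)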

  17. Extensions of EP • Instead of matching moments, use any method for approximate filtering. • Examples: statistical linearization, unscented Kalman filter (UKF), mixture of Kalman filters. • Turn any deterministic filtering method into a smoothing method! All methods can be interpreted as finding linear/Gaussian approximations to the original terms. • Use quadrature or Monte Carlo for term approximations

  18. Example: Poisson Tracking • The observation y_t is an integer-valued Poisson variate whose mean is a function of the hidden state x_t (one standard choice is written out below).
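
The slide's mean function was an image and did not survive; the usual Poisson-tracking example, and a natural reading of the slide, uses an exponential link:

    y_t \sim \mathrm{Poisson}\big(e^{x_t}\big), \qquad x_t = x_{t-1} + \nu_t, \quad \nu_t \sim \mathcal{N}(0, \sigma^2)

The Poisson likelihood is not conjugate to the Gaussian state, which is exactly where the quadrature or Monte Carlo moment computations sketched earlier are needed.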

  19. Accuracy/Efficiency Tradeoff

  20. Bayesian network for Wireless Signal Detection [Figure: chain over s1, …, sT, x1, …, xT and y1, …, yT] • si: transmitted signals • xi: channel coefficients for digital wireless communications • yi: received noisy observations
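
For orientation, a flat-fading observation model commonly used with this variable naming is sketched below; the exact equation on the slide was lost, so treat this as an assumption rather than the talk's model:

    y_t = s_t x_t + w_t, \qquad w_t \sim \mathcal{N}(0, \sigma^2)

with s_t the transmitted symbol, x_t the slowly varying channel coefficient, and w_t observation noise.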

  21. Extended-EP Joint Signal Detection and Channel Estimation • Turn a mixture of Kalman filters into a smoothing method • Smooth over the last L observations (the smoothing window) • Observations before the window act as a prior for the current estimate

  22. Computational Complexity • Expectation propagation: O(nLd^2) • Stochastic mixture of Kalman filters: O(LMd^2) • Rao-Blackwellised particle smoothers: O(LMNd^2) • n: number of EP iterations (typically 4 or 5) • d: dimension of the parameter vector • L: smoothing window length • M: number of samples in filtering (often larger than 500) • N: number of samples in smoothing (larger than 50) • EP is about 5,000 times faster than Rao-Blackwellised particle smoothers.
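
Plugging the slide's own representative values into these expressions shows where the speed-up figure comes from: the ratio of the particle-smoother cost to the EP cost is LMNd^2 / (nLd^2) = MN/n, and with M = 500, N = 50, n = 5 this is (500 x 50) / 5 = 5,000.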

  23. Experimental Results (Chen, Wang, Liu 2000) [Figures: detection performance plotted against signal-to-noise ratio] EP outperforms particle smoothers in efficiency with comparable accuracy.

  24. Bayesian Networks for Adaptive Decoding [Figure: chain over information bits e1, …, eT, channel coefficients x1, …, xT and observations y1, …, yT] The information bits et are coded by a convolutional error-correcting encoder.

  25. EP Outperforms Viterbi Decoding [Figure: decoding performance plotted against signal-to-noise ratio]

  26. Outline • Background on expectation propagation (EP) • Extending EP on Bayesian networks for dynamic systems • Poisson tracking • Signal detection for wireless communications • Tree-structured EP on loopy graphs • Conclusions and future work

  27. Inference on Loopy Graphs [Figure: 4x4 grid of nodes X1 to X16] Problem: estimate marginal distributions of the variables indexed by the nodes in a loopy graph, e.g., p(xi), i = 1, . . . , 16.

  28. 4-node Loopy Graph The joint distribution is a product of pairwise potentials, one for each edge (written out below); we want to approximate it by a simpler distribution.
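
In a standard notation (the potential symbols are assumed, since the slide's formula did not survive):

    p(x) \propto \prod_{(i,j) \in E} \psi_{ij}(x_i, x_j)

where E is the edge set of the loopy graph; TreeEP approximates this by a tree-structured q(x) of the form given earlier.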

  29. BP vs. TreeEP [Figure: the approximating structures used by TreeEP and by BP]

  30. Junction Tree Representation [Figure: p(x), its approximation q(x), and the junction tree of q(x)]

  31. Two Kinds of Edges • On-tree edges, e.g., (x1, x4): exactly incorporated into the junction tree • Off-tree edges, e.g., (x1, x2): approximated by projecting them onto the tree structure

  32. KL Minimization • KL minimization = moment matching • Match the single-node and pairwise marginals (the equivalence is spelled out below) • Reduces to exact inference on single loops • Use cutset conditioning
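
For an exponential-family, tree-structured q, minimizing the KL divergence reduces to matching marginals; in the usual notation (assumed here), with \hat{p} the augmented (tilted) distribution formed from the cavity and the off-tree edge being incorporated:

    q^{new} = \arg\min_q \mathrm{KL}\big( \hat{p}(x) \,\|\, q(x) \big)
    \iff q^{new}(x_i) = \hat{p}(x_i) \ \text{for every node } i, \quad q^{new}(x_i, x_j) = \hat{p}(x_i, x_j) \ \text{for every tree edge } (i, j)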

  33. Matching Marginals on Graph [Figure: junction-tree cliques updated as off-tree edges are incorporated] (1) Incorporate edge (x3, x4) (2) Incorporate edge (x6, x7)

  34. Drawbacks of Global Propagation • Update all the cliques even when only incorporating one off-tree edge • Computationally expensive • Store each off-tree data message as a whole tree • Require large memory size

  35. Solution: Local Propagation • Allow q(x) to be non-coherent during the iterations. It only needs to be coherent in the end. • Exploit the junction tree representation: only locally propagate information within the minimal loop (subtree) that is directly connected to the off-tree edge. • Reduce computational complexity • Save memory

  36. [Figure: junction-tree cliques touched by local propagation] (1) Incorporate edge (x3, x4) (2) Propagate evidence (3) Incorporate edge (x6, x7) On this simple graph, local propagation runs roughly 2 times faster and uses 2 times less memory to store messages than plain EP

  37. New Interpretation of TreeEP • Marry EP with the junction tree algorithm • Can perform efficiently over hypertrees and hypernodes

  38. 4-node Graph • TreeEP = the proposed method • GBP = generalized belief propagation on triangles • TreeVB = variational tree • BP = loopy belief propagation = Factorized EP • MF = mean-field

  39. Fully-connected graphs • Results are averaged over 10 graphs with randomly generated potentials • TreeEP performs the same or better than all other methods in both accuracy and efficiency!

  40. 8x8 grids, 10 trials

  41. TreeEP versus BP and GBP • TreeEP is always more accurate than BP and is often faster • TreeEP is much more efficient than GBP and more accurate on some problems • TreeEP converges more often than BP and GBP

  42. Outline • Background on expectation propagation (EP) • Extending EP on Bayesian networks for dynamic systems • Poisson tracking • Signal detection for wireless communications • Tree-structured EP on loopy graphs • Conclusions and future work

  43. Conclusions • Extend EP on graphical models: • Instead of minimizing KL divergence, use other sensible criteria to generate messages. This effectively turns any deterministic filtering method into a smoothing method. • Use quadrature to approximate messages. • Use local propagation to save computation and memory in tree-structured EP.

  44. Conclusions [Figure: error versus computational time comparing extended EP with state-of-the-art techniques] • Extended EP algorithms outperform state-of-the-art inference methods on graphical models in the trade-off between accuracy and efficiency

  45. Future Work • More extensions of EP: • How to choose a sensible approximation family (e.g. which tree structure) • More flexible approximation: mixture of EP? • Error bound? • Bayesian conditional random fields • EP for optimization (generalize max-product) • More real-world applications

  46. Classifying Colon Cancer Data • The task: distinguish normal and cancer samples • The dataset: 22 normal and 40 cancer samples with 2000 features per sample. • The dataset was randomly split 100 times into 50 training and 12 testing samples. • SVM results from Li et al. 2002

  47. End Contact information: yuanqi@media.mit.edu

  48. Extended EP Accuracy Improves Significantly in only a Few Iterations

  49. EP versus BP • EP approximation is in a restricted family, e.g. Gaussian • EP approximation does not have to be factorized • EP applies to many more problems • e.g. mixture of discrete/continuous variables

  50. EP versus Monte Carlo • Monte Carlo is general but expensive • EP exploits underlying simplicity of the problem if it exists • Monte Carlo is still needed for complex problems (e.g. large isolated peaks) • Trick is to know what problem you have
