Particle-based Variational Inference for Continuous Systems

Alexander Ihler, Andrew Frank, Padhraic Smyth
Department of Computer Science, University of California, Irvine

Summary

• Recent advances in inference for discrete systems have led to algorithms that offer guaranteed convergence, provably more accurate results, and bounds on the partition function.
• We extend these techniques to continuous systems by incorporating them into the particle belief propagation framework.

Handling Continuity

Discretization:
• Works well for some problems, but very fine discretizations may be required to achieve the desired accuracy.
• The discrete domain size is exponential in the number of continuous dimensions.

Parametric assumptions:
• Some methods restrict beliefs to simple parametric forms (Gaussian BP, expectation propagation).
• Effective when the true beliefs match the parametric form, but beliefs may be hard to capture parametrically.
[Figure: a multimodal true belief vs. its Gaussian approximation (true belief ≠ Gaussian approximation).]

Particle Belief Propagation [1]

A non-parametric, adaptive discretization approach. Algorithm overview:
(1) Draw samples from proposal distributions over each variable's domain.
(2) Run importance-reweighted (discrete) belief propagation on the sampled particles.
(3) Adjust the proposals according to the partial inference results, and repeat.
The message update equation and a minimal implementation sketch are given after the experimental results.

Extensions of Particle BP

Mean-field PBP:
• "Exclusive" marginal estimation.
• Non-convex cost function.
• Provides a lower bound on the partition function.

Tree-reweighted PBP [2]:
• "Inclusive" marginal estimation.
• Convex cost function.
• Provides an upper bound on the partition function.

Many possible extensions:
• Fractional BP (a generalization of tree-reweighted BP).
• Expectation propagation (minimizes a "local" alpha divergence).
• Any other message-passing style discrete inference algorithm.
Choose an algorithm based on the desired performance characteristics.

Experimental Results

Model 1: Continuous Attractive Grid
• This toy problem is the continuous analog of the Ising grid, with local potentials on each variable x_s and attractive pairwise potentials f(x_s, x_t) that depend on ||x_s - x_t||.
• The true marginals show two highly probable regions. PBP finds each of them in separate runs, but never both together. TRW-PBP finds both regions, but is less certain.
[Figure: local and pairwise potentials; marginal estimates (Exact, PBP Run 1, PBP Run 2, TRW-PBP) under weak and strong pairwise potentials.]

Model 2: Sensor Localization
• Sensors have noisy pairwise distance measurements of each other.
• "Anchor" nodes have known locations.
• Task: compute a marginal distribution over the "target" node's location.

Experimental Results (contd.)

Continuous grid marginal estimation:
[Figure: actual vs. estimated marginals for MF-PBP, PBP, and TRW-PBP.]

Bounding the partition function:
• Mean-field PBP always converges to a lower bound.
• Tree-reweighted PBP converges to an upper bound with enough particles.
• With fewer than 30 particles, the "upper bound" may be below the true value.
• Variance decreases as the number of particles increases.
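
The poster's message update equation did not survive transcription. As a reconstruction, the standard importance-reweighted update of particle belief propagation [1] has the form below, where x_t^(j) ~ W_t are the N particles drawn from variable t's proposal, Γ(t) denotes t's neighbors, and ψ_t, ψ_st are the local and pairwise potentials; treat this as a sketch following [1], not the poster's exact statement:

    m_{t \to s}\big(x_s^{(i)}\big) \;\propto\; \sum_{j=1}^{N} \frac{\psi_t\big(x_t^{(j)}\big)\,\psi_{st}\big(x_s^{(i)}, x_t^{(j)}\big)}{W_t\big(x_t^{(j)}\big)} \prod_{u \in \Gamma(t) \setminus s} m_{u \to t}\big(x_t^{(j)}\big)

The 1/W_t factor is the importance correction for drawing particles from the proposal rather than from the (unknown) belief, so the sum is a Monte Carlo estimate of the integral in the exact continuous BP update.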

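To make the three-step loop concrete, here is a minimal Python sketch of particle BP on a small chain of continuous variables. It is not the authors' code: the chain structure, the bimodal local potential, the attractive pairwise potential, the particle counts, and the moment-matching rule used to adjust the Gaussian proposals are all illustrative assumptions.

# Minimal particle BP sketch: a 4-variable chain with bimodal local potentials
# and attractive pairwise potentials. All modelling choices are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N = 50                                    # particles per variable
V = 4                                     # variables x_0 ... x_3, connected in a chain
edges = [(v, v + 1) for v in range(V - 1)]
neighbors = {v: [] for v in range(V)}
for s, t in edges:
    neighbors[s].append(t)
    neighbors[t].append(s)

def local_pot(x):
    # Bimodal local potential with modes near 0.25 and 0.75 (assumed).
    return np.exp(-(x - 0.25) ** 2 / 0.02) + np.exp(-(x - 0.75) ** 2 / 0.02)

def pair_pot(xt, xs):
    # Attractive pairwise potential: neighbors prefer similar values.
    return np.exp(-(xt[:, None] - xs[None, :]) ** 2 / 0.05)

# Gaussian proposal per variable; start broad and refine over outer iterations.
mu, sigma = np.full(V, 0.5), np.full(V, 0.3)

for outer in range(5):
    # (1) Sample particles from each proposal and record proposal densities.
    X = np.array([rng.normal(mu[v], sigma[v], N) for v in range(V)])
    W = np.array([np.exp(-(X[v] - mu[v]) ** 2 / (2 * sigma[v] ** 2))
                  / (sigma[v] * np.sqrt(2 * np.pi)) for v in range(V)])

    # (2) Importance-reweighted discrete BP over the particle sets.
    msgs = {}
    for s, t in edges:
        msgs[(s, t)] = np.ones(N)
        msgs[(t, s)] = np.ones(N)
    for _ in range(20):
        new = {}
        for (t, s) in msgs:                      # message from t to s
            pre = local_pot(X[t]) / W[t]         # importance-corrected sender terms
            for u in neighbors[t]:
                if u != s:
                    pre = pre * msgs[(u, t)]
            m = pre @ pair_pot(X[t], X[s])       # sum over the sender's particles
            new[(t, s)] = m / m.sum()
        msgs = new

    # Belief weights at each variable's particles (importance-corrected).
    beliefs = []
    for v in range(V):
        b = local_pot(X[v]) / W[v]
        for u in neighbors[v]:
            b = b * msgs[(u, v)]
        beliefs.append(b / b.sum())

    # (3) Adjust proposals: moment-match each Gaussian to the current weighted belief.
    for v in range(V):
        mu[v] = np.sum(beliefs[v] * X[v])
        sigma[v] = max(np.sqrt(np.sum(beliefs[v] * (X[v] - mu[v]) ** 2)), 0.05)

print("estimated marginal means:", np.round(mu, 3))

Mean-field and tree-reweighted variants change only the discrete message-passing in step (2); the sampling and proposal-adjustment steps are unchanged. Because each proposal here is a single Gaussian, a run may concentrate on one mode of a multimodal belief, mirroring the behavior reported for PBP on the two-mode grid above.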

References

1. A. Ihler and D. McAllester. Particle belief propagation. In AI & Statistics: JMLR W&CP, volume 5, pages 256–263, April 2009.
2. M. Wainwright, T. Jaakkola, and A. Willsky. A new class of upper bounds on the log partition function. IEEE Trans. Info. Theory, 51(7):2313–2335, July 2005.