
Distributed Message Passing for Large Scale Graphical Models








  1. Distributed Message Passing for Large Scale Graphical Models Alexander Schwing, Tamir Hazan, Marc Pollefeys, Raquel Urtasun. CVPR 2011

  2. Outline • Introduction • Related work • Message passing algorithm • Distributed convex belief propagation • Experiment evaluation • Conclusion

  3. Introduction • Many vision problems reduce to discrete labeling problems over an undirected graphical model (e.g., an MRF). • Common solvers: belief propagation (BP) and graph cuts • Which solver applies depends on the potentials and the structure of the graph. • The main limitations when scaling to real-world problems are memory and computation.
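To make the labeling view concrete, here is a minimal sketch of a pairwise MRF energy and a brute-force MAP search on a toy chain. All numbers, the Potts smoothness term, and the `mrf_energy` helper are illustrative assumptions, not the paper's model.

```python
import itertools

def mrf_energy(labels, unary, pairwise, edges):
    """Energy of a labeling in a pairwise MRF: unary terms plus edge terms."""
    e = sum(unary[i][labels[i]] for i in range(len(labels)))
    e += sum(pairwise[labels[i]][labels[j]] for i, j in edges)
    return e

# Tiny 3-node chain with 2 labels (hypothetical costs).
unary = [[0.0, 2.0], [1.5, 0.5], [2.0, 0.0]]
pairwise = [[0.0, 1.0], [1.0, 0.0]]  # Potts: penalty 1 for disagreeing neighbors
edges = [(0, 1), (1, 2)]

# Brute-force MAP: the lowest-energy assignment over all 2^3 labelings.
best = min(itertools.product(range(2), repeat=3),
           key=lambda l: mrf_energy(l, unary, pairwise, edges))
```

Exhaustive search like this is only feasible on toy problems; message passing and graph cuts are the practical alternatives the slide names.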

  4. Introduction • A new algorithm that • distributes and parallelizes the computation and memory requirements • while conserving the convergence and optimality guarantees • Computation is done in parallel by partitioning the graph and imposing agreement between the beliefs on the boundaries. • The graph-based optimization program is split into local optimization problems (one per machine). • Messages between machines: Lagrange multipliers • Application: stereo reconstruction from high-resolution images • Handles large problems (more than 200 labels in images larger than 10 MPixel)
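The "agreement on the boundary via Lagrange multipliers" idea can be sketched in the style of dual decomposition: two machines each minimize their local cost for a shared boundary variable, and a subgradient step on the multipliers pushes them toward agreement. The costs, step size, and update rule below are a generic illustration, not the paper's block-coordinate scheme.

```python
def local_argmin(costs):
    return min(range(len(costs)), key=lambda k: costs[k])

# A single shared boundary variable with 2 labels; each machine holds part
# of its cost (hypothetical numbers).
cost_a = [0.0, 1.0]   # machine A's local cost for the shared variable
cost_b = [0.5, 0.0]   # machine B's local cost for the shared variable

lam = [0.0, 0.0]      # Lagrange multipliers enforcing agreement
step = 0.25
for t in range(200):
    xa = local_argmin([cost_a[k] + lam[k] for k in range(2)])
    xb = local_argmin([cost_b[k] - lam[k] for k in range(2)])
    if xa == xb:
        break
    # Subgradient step: shift cost toward the label the other machine prefers.
    for k in range(2):
        lam[k] += step * ((1.0 if xa == k else 0.0) - (1.0 if xb == k else 0.0))
```

Here the machines disagree initially (A prefers label 0, B prefers label 1) and the multiplier update makes them settle on the jointly cheapest label.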

  5. Related work • Goal: provable convergence while remaining computationally tractable. • The proposed method parallelizes convex belief propagation • and conserves its convergence and optimality guarantees • Strandmark and Kahl [24] • splitting the model across multiple machines • GraphLab • assumes that all the data is stored in shared memory

  6. Related work • Split the message passing task at hand into several local optimization problems that are solved in parallel. • To ensure convergence, the local tasks are forced to communicate occasionally. • At the local level, the message passing algorithm is parallelized using a greedy vertex coloring.
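A greedy vertex coloring assigns each vertex the smallest color not used by its already-colored neighbors; vertices of the same color share no edge, so their message updates can run in parallel. A minimal sketch (the toy graph is an assumption):

```python
def greedy_coloring(adj):
    """Greedy coloring: each vertex takes the smallest color absent from
    its already-colored neighbors. Same-color vertices are non-adjacent,
    so their updates can be scheduled in parallel."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# 4-cycle: bipartite, so the greedy pass finds a 2-coloring.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
colors = greedy_coloring(adj)
```

The coloring then defines update rounds: all vertices of color 0 update simultaneously, then all of color 1, and so on.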

  7. Message passing algorithm • The joint distribution factors into a product of non-negative functions. • This defines a hypergraph whose nodes represent the n random variables; the subsets of variables x correspond to its hyperedges. • The hypergraph can be represented as a bipartite factor graph [11]: • one set of nodes corresponds to the original nodes of the hypergraph: the variable nodes • the other set consists of its hyperedges: the factor nodes • N(i): all factor nodes that are neighbors of variable node i
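The bipartite structure is easy to hold in code: store each factor with the variables in its scope, and derive N(i) by inverting that map. The toy model below is a hypothetical three-factor example.

```python
from collections import defaultdict

# Factor graph as a bipartite structure: each factor maps to the
# variables in its scope (hypothetical toy model).
factors = {"f1": (0, 1), "f2": (1, 2), "f3": (2,)}

# N[i]: the factor nodes neighboring variable node i.
N = defaultdict(list)
for name, scope in factors.items():
    for i in scope:
        N[i].append(name)
```

Message passing then alternates variable-to-factor and factor-to-variable messages along exactly these adjacency lists.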

  8. Message passing algorithm • Maximum a posteriori (MAP) assignment • Reformulate the MAP problem as an integer linear program.
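For reference, the textbook integer-linear-program formulation of MAP over a factor graph reads as follows (generic notation, reconstructed from the standard literature rather than copied from the slides):

```latex
\max_{b} \;\; \sum_i \sum_{x_i} \theta_i(x_i)\, b_i(x_i)
\;+\; \sum_{\alpha} \sum_{x_\alpha} \theta_\alpha(x_\alpha)\, b_\alpha(x_\alpha)
\quad \text{s.t.} \quad
b_i(x_i),\, b_\alpha(x_\alpha) \in \{0,1\}, \qquad
\sum_{x_i} b_i(x_i) = 1, \qquad
\sum_{x_\alpha \setminus x_i} b_\alpha(x_\alpha) = b_i(x_i).
```

Relaxing the integrality constraints to b >= 0 yields the linear-programming relaxation that convex belief propagation optimizes.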

  9. Message passing algorithm

  10. Distributed convex belief propagation • Partition the vertices of the graphical model to disjoint subgraphs • each computer solves independently a variational program with respect to its subgraph. • The distributed solutions are then integrated through message-passing between the subgraphs • preserving the consistency of the graphical model. • Properties : • If (5) is strictly concave then the algorithm converges for all ε >= 0, and converges to the global optimum when ε > 0.

  11. Distributed convex belief propagation

  12. Distributed convex belief propagation

  13. Distributed convex belief propagation • Two sets of Lagrange multipliers: • one enforcing the marginalization constraints within each computer • one enforcing the consistency constraints between the different computers
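In the spirit of dual decomposition, the decomposed program admits a Lagrangian of the following generic shape (the notation is illustrative, not the paper's):

```latex
L(b, \lambda, \nu) \;=\; \sum_{m=1}^{M}
\Big( \langle \theta_m, b_m \rangle + \epsilon H(b_m)
      + \lambda_m^{\top} g_m(b_m) \Big)
\;+\; \nu^{\top} c(b_1, \dots, b_M),
```

where g_m stacks the marginalization constraints inside machine m (multipliers lambda_m), c stacks the boundary-consistency constraints between machines (multipliers nu), and the entropy term epsilon H(b_m) is what makes the local objectives strictly concave for epsilon > 0, matching the convergence property stated on slide 10.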

  14. Distributed convex belief propagation

  15. Experiment evaluation • Stereo reconstruction • nine 2.4 GHz x64 quad-core computers with 24 GB of memory each, connected via a standard local area network • libDAI 0.2.7 [17] and GraphLab [16]

  16. Experiment evaluation

  17. Experiment evaluation

  18. Experiment evaluation • Convergence is measured via the relative duality gap.
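The relative duality gap normalizes the distance between the dual bound and the primal value, so that zero certifies optimality. A minimal sketch, assuming the usual convention that the dual upper-bounds the primal in a maximization:

```python
def relative_duality_gap(primal, dual):
    """Relative duality gap: (dual - primal) / |dual|.
    Zero means the primal solution is certified optimal
    (assumes dual >= primal, as in a maximization with an upper bound)."""
    return (dual - primal) / abs(dual)
```

For example, a primal value of 9.0 against a dual bound of 10.0 gives a relative gap of 0.1.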

  19. Conclusion • Handles large scale graphical models by dividing the computation and memory requirements across multiple machines. • Convergence and optimality guarantees are preserved. • Main benefit: the ability to use multiple computers.
