
Asymptotic Behavior of Stochastic Complexity of Complete Bipartite Graph-Type Boltzmann Machines

This study investigates the asymptotic behavior of the stochastic complexity of complete bipartite graph-type Boltzmann machines in the context of Bayes learning. An upper bound on the mean field stochastic complexity is derived, and the accuracy of the approximation is discussed. Comparisons with other studies and directions for future work are also presented.



Presentation Transcript


  1. Asymptotic Behavior of Stochastic Complexity of Complete Bipartite Graph-Type Boltzmann Machines. Yu Nishiyama and Sumio Watanabe, Tokyo Institute of Technology, Japan

  2. Background. Learning machines are used throughout information systems: pattern recognition (mixture models), natural language processing (hidden Markov models), and gene analysis (Bayesian networks). Mathematically, these are singular statistical models, for which Bayes learning is effective.

  3. Problem: Calculations that involve the Bayes posterior require huge computational cost. Mean field approximation replaces the Bayes posterior with a tractable trial distribution. The associated stochastic complexity measures the accuracy of the approximation, characterizes the difference from regular statistical models, and is used for model selection.

  4. The asymptotic behavior of the mean field stochastic complexity has been studied for several models:
  • Mixture models [K. Watanabe et al., 2004]
  • Reduced rank regressions [Nakajima et al., 2005]
  • Hidden Markov models [Hosino et al., 2005]
  • Stochastic context-free grammars [Hosino et al., 2005]
  • Neural networks [Nakano et al., 2005]

  5. Purpose: We derive an upper bound on the mean field stochastic complexity of complete bipartite graph-type Boltzmann machines. Boltzmann machines are graphical models that originate in spin systems.

  6. Table of Contents
  • Review: Bayes learning; mean field approximation; Boltzmann machines (complete bipartite graph-type)
  • Main theorem: statement and outline of the proof
  • Discussion and conclusion

  7. Bayes Learning. Let q(x) be the true distribution, p(x|w) the model, and φ(w) the prior. Given training data X^n = {x_1, ..., x_n}, the Bayes posterior is p(w|X^n) ∝ φ(w) Π_{i=1}^n p(x_i|w), and the Bayes predictive distribution is p(x|X^n) = ∫ p(x|w) p(w|X^n) dw.

  8. Mean Field Approximation (1). The Bayes posterior can be rewritten in terms of the prior and the likelihood of the n training data. We consider the Kullback distance from a trial distribution r(w) to the Bayes posterior p(w|X^n): K(r) = ∫ r(w) log [ r(w) / p(w|X^n) ] dw.

  9. Mean Field Approximation (2). When we restrict the trial distribution to a factorized form r(w) = Π_k r_k(w_k), the distribution that minimizes K(r) is called the mean field approximation, and the attained minimum value is called the mean field stochastic complexity.
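As a concrete illustration of this minimization (not from the paper), the sketch below approximates a small discrete target distribution by a factorized trial distribution via coordinate descent on the Kullback distance. The two-parameter binary setup and all names are hypothetical.

```python
# Illustrative sketch: mean field approximation of a discrete target p(w1, w2)
# over two binary parameters by a factorized trial distribution
# r(w1, w2) = r1(w1) * r2(w2), found by coordinate descent on K(r).
import numpy as np

def kl(r, p):
    """Kullback distance K(r) = sum r log(r/p)."""
    mask = r > 0
    return float(np.sum(r[mask] * np.log(r[mask] / p[mask])))

def mean_field(p, iters=100):
    """Coordinate updates: r1 ∝ exp(E_{r2}[log p]), r2 ∝ exp(E_{r1}[log p])."""
    r1 = np.array([0.5, 0.5])
    r2 = np.array([0.5, 0.5])
    logp = np.log(p)
    for _ in range(iters):
        r1 = np.exp(logp @ r2); r1 /= r1.sum()
        r2 = np.exp(r1 @ logp); r2 /= r2.sum()
    return r1, r2

# A correlated target distribution (rows: w1, cols: w2).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
r1, r2 = mean_field(p)
r = np.outer(r1, r2)   # factorized approximation
F_bar = kl(r, p)       # minimized Kullback distance (the "complexity" here)
print(r1, r2, F_bar)
```

Because a product distribution cannot represent the correlation in this target, the minimized Kullback distance stays strictly positive; that residual gap is what the mean field stochastic complexity quantifies for the Bayes posterior.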

  10. Complete Bipartite Graph-type Boltzmann Machines. The machine has M visible (input and output) units x = (x_1, ..., x_M) and H hidden units y = (y_1, ..., y_H), connected by a complete bipartite graph with weights w = (w_ij). Each unit takes a value in {-1, +1}, and the parametric model p(x|w) is obtained by summing the hidden units out of the Boltzmann distribution proportional to exp(Σ_ij w_ij x_i y_j).
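Assuming the ±1-valued units described above, each hidden unit can be summed out with the identity Σ_{y=±1} exp(y a) = 2 cosh(a). The sketch below enumerates all visible states for a toy machine; the sizes and names are illustrative, not the paper's.

```python
# Sketch of the assumed parametric model: visible units x in {-1,+1}^M,
# hidden units y in {-1,+1}^H, complete bipartite weights w (M x H).
# Marginalizing each hidden unit uses sum_{y_j=±1} exp(y_j * a_j) = 2*cosh(a_j).
import itertools
import numpy as np

def unnormalized(x, w):
    """Unnormalized visible marginal: prod_j 2*cosh(sum_i w_ij x_i)."""
    a = x @ w  # activations of the H hidden units
    return float(np.prod(2.0 * np.cosh(a)))

def model(w):
    """Normalized distribution p(x|w) over all 2^M visible states."""
    M = w.shape[0]
    states = [np.array(s) for s in itertools.product([-1, 1], repeat=M)]
    vals = np.array([unnormalized(x, w) for x in states])
    return states, vals / vals.sum()

rng = np.random.default_rng(0)
w = 0.5 * rng.standard_normal((3, 2))  # toy sizes: M=3 visible, H=2 hidden
states, p = model(w)
print(p)
```

Summing the hidden units out analytically keeps the enumeration over 2^M visible states only, rather than 2^(M+H) joint states; this is the form of the model to which the Kullback information below is applied.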

  11. True Distribution. We assume that the true distribution is included in the parametric model and that its number of hidden units is H_0 (H_0 ≤ H): the true distribution is a complete bipartite graph-type Boltzmann machine with M visible units and H_0 hidden units.

  12. Main Theorem. The mean field stochastic complexity of complete bipartite graph-type Boltzmann machines has an upper bound of the form λ log n + C, where n is the number of training data, C is a constant, and the coefficient λ is determined by M (the number of input and output units), H (the number of hidden units of the learning machine), and H_0 (the number of hidden units of the true distribution).

  13. Outline of the Proof (Methods). The upper bound is obtained by evaluating the minimized functional at a concrete trial distribution; the construction depends on the Boltzmann machine, a normal distribution family used for the trial distribution, and the prior.

  14. Outline of the Proof [Lemma]. Let K(w) be the Kullback information. If there exists a parameter value w* with K(w*) = 0 such that the number of elements of the set of non-zero entries of the Hessian matrix of K at w* is at most k, then the mean field stochastic complexity has an upper bound whose log n coefficient is determined by k.

  15. We apply this lemma to the Boltzmann machines. The Kullback information K(w) between the true distribution and the model is written out explicitly, and its second-order differential, the Hessian matrix, is computed.
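The two facts used in this step, K(w*) = 0 at a true parameter and a positive semi-definite Hessian there, can be checked numerically for a toy machine. The sketch below assumes the ±1-unit model form with hidden units summed out via 2 cosh(·), and uses finite differences for the Hessian; all sizes and names are illustrative.

```python
# Sketch (assumed setup): numerically verify that the Kullback information
# K(w) = sum_x q(x) log( q(x) / p(x|w) ) vanishes at a true parameter w0 and
# that its Hessian there is positive semi-definite.
import itertools
import numpy as np

M, H = 2, 2  # toy sizes: visible and hidden units of the learning machine

def dist(w):
    """p(.|w) over the 2^M visible states, hidden ±1 units marginalized."""
    states = [np.array(s) for s in itertools.product([-1, 1], repeat=M)]
    vals = np.array([np.prod(2.0 * np.cosh(x @ w)) for x in states])
    return vals / vals.sum()

def K(w, q):
    """Kullback information from the true distribution q to p(.|w)."""
    p = dist(w)
    return float(np.sum(q * np.log(q / p)))

w0 = np.array([[0.7, 0.0], [-0.3, 0.0]])  # true parameter: one active hidden unit
q = dist(w0)

def hessian(f, w, eps=1e-4):
    """Central finite-difference Hessian of f at w (flattened parameters)."""
    v = w.flatten(); d = v.size
    Hm = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            for si in (1, -1):
                for sj in (1, -1):
                    u = v.copy(); u[i] += si * eps; u[j] += sj * eps
                    Hm[i, j] += si * sj * f(u.reshape(w.shape), q)
    return Hm / (4 * eps * eps)

Hess = hessian(K, w0)
print(K(w0, q), np.linalg.eigvalsh(Hess))
```

Since K(w) is non-negative and vanishes at w0, the parameter w0 is a global minimum, so every eigenvalue of the Hessian there is non-negative; directions with (near-)zero eigenvalues reflect the singularity of the model.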

  16. We choose the parameter w* to be a true parameter, so that K(w*) = 0. Counting the non-zero entries of the Hessian matrix at w* and applying the lemma gives the upper bound of the main theorem.

  17. Discussion. Comparison with other studies: as a function of the number n of training data in the asymptotic region, the derived mean field upper bound is compared with the stochastic complexity of a regular statistical model and with the upper bound for Bayes learning derived by algebraic geometry [Yamazaki].

  18. Conclusion
  • We derived an upper bound on the mean field stochastic complexity of complete bipartite graph-type Boltzmann machines.
  Future works:
  • Lower bound
  • Comparison with experimental results
