Approximate inference for stochastic dynamics in large biochemical networks

Presentation Transcript


  1. Approximate inference for stochastic dynamics in large biochemical networks. Scientific Kick-off Meeting, Torino, 03 - 06 February 2013. ESR: Ludovica Bachschmid Romano. Supervisor: Prof. Manfred Opper, Technische Universität Berlin.

  2. Dynamic mean field for infinite-ranged Ising models. OUTLINE: • Introduction • Adaptive TAP equations • The Expectation Propagation algorithm • Expectation Propagation for dynamical Ising models

  3. Introduction: spin models with quenched disorder. Models with random infinite-ranged interactions can be well described by mean-field methods. Statics: • quenched-average behaviour (random couplings): replica method; • specific realization of the disorder (fixed couplings): naive mean field, TAP equations. The TAP equations explicitly depend on the distribution of the couplings. Two approaches that do not assume a specific randomness of the couplings but rather adapt to the concrete data are adaptive TAP and Expectation Propagation. We aim to extend them to the dynamical case.
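
For reference (not shown on the slide), the classical TAP equation for the Sherrington-Kirkpatrick model illustrates this dependence; here β is the inverse temperature, h_i an external field, and the couplings are assumed to have variance J²/N:

\[
m_i = \tanh\!\Big(\beta h_i + \beta\sum_{j} J_{ij} m_j - \beta^{2} J^{2} (1-q)\, m_i\Big),
\qquad q = \frac{1}{N}\sum_{j} m_j^{2}.
\]

The Onsager reaction term contains the coupling variance J², which is why the standard TAP equations are tied to an assumed coupling distribution.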

  4. Adaptive TAP equations. For each spin, consider the cavity field created by the remaining spins. We can write the joint distribution of the spin and its cavity field; approximating the cavity-field distribution by a Gaussian (justified by the weak dependencies between the random variables), we obtain its mean and variance.
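
The formulas on this slide were not preserved in the transcript; a plausible reconstruction, following the adaptive TAP construction of Opper and Winther cited in the bibliography (the notation h_i, θ_i, V_i is assumed here), is

\[
h_i = \sum_{j\neq i} J_{ij} s_j, \qquad
p(s_i, h_i) \propto e^{\theta_i s_i + s_i h_i}\, p_{\setminus i}(h_i), \qquad
p_{\setminus i}(h_i) \approx \mathcal{N}\!\big(h_i;\ \langle h_i\rangle_{\setminus i},\ V_i\big),
\]

which, since \(s_i^{2}=1\), gives the marginal \(m_i = \langle s_i\rangle = \tanh\big(\theta_i + \langle h_i\rangle_{\setminus i}\big)\).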

  5. Adaptive TAP equations. The expectation of each spin can then be written in terms of the cavity mean, giving the first set of TAP equations. Using the Gaussian approximation, the cavity mean splits into the naive mean-field term and an Onsager reaction term. The remaining task is to compute the cavity variances V_i; by definition, they are the variances of the cavity fields under the cavity distributions.
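
A hedged reconstruction of the missing equations, in the same assumed notation (cavity averages written \(\langle\cdot\rangle_{\setminus i}\)):

\[
m_i = \tanh\!\Big(\theta_i + \sum_{j\neq i} J_{ij} m_j - V_i m_i\Big),
\]

where the term \(-V_i m_i\) is the Onsager term, and the cavity-field variance is by definition

\[
V_i = \sum_{j,k\neq i} J_{ij} J_{ik}\,\Big(\langle s_j s_k\rangle_{\setminus i} - \langle s_j\rangle_{\setminus i}\langle s_k\rangle_{\setminus i}\Big).
\]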

  6. Adaptive TAP equations. The unknown cavity variances are computed from a linear-response relation for the covariance matrix, requiring consistency on its diagonal. For an Ising model this yields a closed set of equations for the V_i.
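
In the adaptive TAP scheme of Opper and Winther (cited in the bibliography), the closure presumably shown here takes the form (χ denotes the susceptibility, i.e. the covariance matrix):

\[
\chi_{ij} = \frac{\partial m_i}{\partial \theta_j}, \qquad
\chi = (\Lambda - J)^{-1}, \qquad
\Lambda_{ij} = \delta_{ij}\Big(V_i + \frac{1}{1-m_i^{2}}\Big),
\]

together with the diagonal consistency condition

\[
\big[(\Lambda - J)^{-1}\big]_{ii} = 1 - m_i^{2},
\]

which determines the V_i without any assumption on the coupling statistics.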

  7. Expectation Propagation (EP). A method for approximating intractable integrals arising in probabilistic inference. The intractable distribution is written as a product of factors, and the approximation replaces each factor by a term from the exponential family. The parameters of the approximation are obtained by matching its expected sufficient statistics to the corresponding moments of the target. The EP algorithm iteratively updates the approximating terms by enforcing the approximation to share moments with each of the tilted distributions.
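
In symbols (the factor notation f_n, g_n is an assumption of this sketch, following Minka's formulation cited in the bibliography):

\[
p(x) \propto \prod_{n} f_n(x), \qquad
q(x) \propto \prod_{n} g_n(x), \quad g_n \in \text{exponential family},
\]

and one EP iteration updates a term via the tilted distribution:

\[
q^{\setminus n}(x) \propto \frac{q(x)}{g_n(x)}, \qquad
\hat p_n(x) \propto q^{\setminus n}(x)\, f_n(x), \qquad
g_n^{\mathrm{new}}(x) \propto \frac{\operatorname{proj}\!\big[\hat p_n\big](x)}{q^{\setminus n}(x)},
\]

where proj[·] denotes moment matching of the expected sufficient statistics within the chosen exponential family.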

  8. EP for Ising models. Choose a suitable factorization of the Ising distribution and define the corresponding approximating distribution. The approximation is a full multivariate Gaussian density, thus the moment-matching conditions involve only the mean and the variance.
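
One natural choice, following the Opper-Winther treatment cited in the bibliography (the term parameters λ_i, γ_i are assumed notation):

\[
p(s) \propto e^{\frac{1}{2} s^{\top} J s + \theta^{\top} s}\prod_{i} p_i(s_i),
\qquad p_i(s_i)=\tfrac{1}{2}\big[\delta(s_i-1)+\delta(s_i+1)\big],
\]
\[
q(s) \propto e^{\frac{1}{2} s^{\top} J s + \theta^{\top} s}\prod_{i} e^{-\frac{1}{2}\lambda_i s_i^{2} + \gamma_i s_i},
\]

so that q is a full multivariate Gaussian with precision matrix \(\mathrm{diag}(\lambda) - J\), and moment matching only involves means and variances.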

  9. Write the tilted distribution using the cavity Gaussian distribution. Matching the mean and the variance of the tilted distribution to the mean and the variance of the approximating marginal, we obtain the same equations as with the adaptive TAP approach.
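
As a small numerical illustration (not part of the slides), here is a minimal Python sketch of this Gaussian EP scheme for the equilibrium Ising model, under the assumed parametrization (λ_i, γ_i) above; at a fixed point its means satisfy the adaptive TAP equations. The couplings are assumed weak, symmetric, and zero on the diagonal.

import numpy as np

def ep_ising(J, theta, n_sweeps=200, damping=0.5):
    # Hypothetical sketch of Gaussian EP for p(s) ~ exp(0.5*s'Js + theta's),
    # s in {-1,+1}^N. Each binary site factor is replaced by a Gaussian term
    # exp(-0.5*lam_i*s_i^2 + gam_i*s_i); (lam, gam) is an assumed parametrization.
    N = len(theta)
    lam = np.ones(N)    # precisions of the Gaussian site terms
    gam = np.zeros(N)   # natural means of the Gaussian site terms
    for _ in range(n_sweeps):
        # Gaussian moments of q(s); recomputed once per sweep for simplicity
        # (exact sequential EP would rank-one update after every site).
        Sigma = np.linalg.inv(np.diag(lam) - J)
        mu = Sigma @ (theta + gam)
        for i in range(N):
            # cavity distribution: divide the i-th Gaussian term out of q_i
            p_cav = 1.0 / Sigma[i, i] - lam[i]    # cavity precision
            b_cav = mu[i] / Sigma[i, i] - gam[i]  # cavity natural mean
            # tilted moments: on s_i = +/-1 the quadratic part is constant,
            # so the tilted mean is tanh(b_cav) and the variance 1 - m^2
            m = np.tanh(b_cav)
            v = 1.0 - m ** 2
            # moment matching: the new marginal must have mean m and variance v
            lam[i] = damping * lam[i] + (1 - damping) * (1.0 / v - p_cav)
            gam[i] = damping * gam[i] + (1 - damping) * (m / v - b_cav)
    Sigma = np.linalg.inv(np.diag(lam) - J)
    mu = Sigma @ (theta + gam)
    return mu, Sigma    # approximate magnetizations and covariances

Damping and the weak-coupling assumption keep diag(lam) - J positive definite; the returned covariance plays the role of the linear-response susceptibility in the adaptive TAP closure.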

  10. Dynamics. Model: • N Ising spins; • all spins interact with all other spins; • weak couplings, no assumptions on the distribution of the couplings; • parallel (synchronous) update. Idea: we write the cavity field distribution by "removing" from the system a whole spin trajectory, and we approximate the cavity field by a multivariate Gaussian with time-dependent mean and variance.
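
The parallel-update rule left blank on the slide is presumably the synchronous (kinetic) Ising dynamics; in assumed notation, with local field φ_i(t):

\[
p\big(\mathbf{s}(t+1)\mid \mathbf{s}(t)\big)=\prod_{i=1}^{N}
\frac{\exp\!\big[s_i(t+1)\,\phi_i(t)\big]}{2\cosh\phi_i(t)},
\qquad
\phi_i(t)=\theta_i+\sum_{j} J_{ij}\, s_j(t).
\]

The dynamical EP idea is then to remove the whole trajectory of one spin at once, so the cavity field becomes a time-indexed vector approximated by a multivariate Gaussian with time-dependent mean and covariance.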

  11. To be done. Short term: complete the dynamical EP equations, perform numerical simulations. Long term: in the framework of advanced mean-field methods, study continuous-time stochastic processes. • Bibliography: • T. P. Minka, in Proc. UAI 2001, pp. 362-369, 2001. • M. Opper and O. Winther, Neural Computation 12:2655-2684, 2000. • M. Opper and O. Winther, in Advanced Mean Field Methods: Theory and Practice, edited by M. Opper and D. Saad (MIT Press, Cambridge, MA, 2001). • M. Opper and O. Winther, Physical Review E 64, 056131 (2001). • D. J. Thouless, P. W. Anderson, and R. G. Palmer, Philosophical Magazine 35, 593 (1977).
