
Bayesian Inference

Bayesian Inference. Lee Harrison, York Neuroimaging Centre, 14/05/2010. Topics: posterior probability maps (PPMs); spatial priors on activation extent; Bayesian segmentation and normalisation; Dynamic Causal Modelling; and their place in the SPM pipeline (realignment, smoothing, general linear model, Gaussian field theory).



Presentation Transcript


  1. Bayesian Inference Lee Harrison York Neuroimaging Centre 14 / 05 / 2010

  2. Overview: Posterior probability maps (PPMs); Spatial priors on activation extent; Bayesian segmentation and normalisation; Dynamic Causal Modelling. [Figure: the classical SPM pipeline: realignment → smoothing → general linear model → Gaussian field theory → statistical inference (p < 0.05), with normalisation to a template.]

  3. Attention to Motion Paradigm. Conditions: fixation only; observe static dots (photic → V1); observe moving dots (motion → V5); task on moving dots (attention → V5 + parietal cortex). [Figure: results showing SPC, V3A and V5+ for the Attention – No attention contrast.] Büchel & Friston 1997, Cereb. Cortex; Büchel et al. 1998, Brain.

  4. Attention to Motion Paradigm: Dynamic Causal Models. Conditions: fixation only; observe static dots; observe moving dots; task on moving dots. Model 1 (forward): attentional modulation of V1 → V5. Model 2 (backward): attentional modulation of SPC → V5. Bayesian model selection: which model is optimal? [Figure: the two model graphs over V1, V5 and SPC, with photic, motion and attention inputs.]

  5. Responses to Uncertainty. [Figure: long-term vs short-term memory observer models.]

  6. Responses to Uncertainty. Paradigm: 40 trials of randomly sampled discrete events. Model: a simple computational model of an observer's response to uncertainty, based on the number of past events retained (extent of memory). Question: which regions are best explained by the short-term vs the long-term memory model?

  7. Overview • Introductory remarks • Some probability densities/distributions • Probabilistic (generative) models • Bayesian inference • A simple example – Bayesian linear regression • SPM applications • Segmentation • Dynamic causal modelling • Spatial models of fMRI time series

  8–14. Probability distributions and densities (k = 2). [Figure series: one example distribution or density per slide; the slide text is identical across the sequence.]

  15. Generative models. [Figure: generation maps from the model parameters to data in space and time; estimation inverts this mapping.]

  16. Bayesian statistics. posterior ∝ likelihood ∙ prior. The "posterior" probability of the parameters given the data is an optimal combination of prior knowledge and new data, weighted by their relative precision. Bayes' theorem allows one to formally incorporate prior knowledge into computing statistical probabilities.

  17. Bayes' rule. Given data y and parameters θ, their joint probability can be written in two ways: p(y, θ) = p(y|θ) p(θ) = p(θ|y) p(y). Eliminating p(y, θ) gives Bayes' rule: Posterior = Likelihood × Prior / Evidence, i.e. p(θ|y) = p(y|θ) p(θ) / p(y).
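As a minimal numerical sketch of Bayes' rule (not part of the original slides), the posterior can be computed on a discrete grid of parameter values; the grid, the flat prior and the coin-flip likelihood below are illustrative assumptions:

```python
import numpy as np

def grid_posterior(theta, prior, likelihood):
    """Bayes' rule on a discrete grid: posterior ∝ likelihood × prior.

    theta      : grid of parameter values
    prior      : p(theta) evaluated on the grid (sums to 1)
    likelihood : p(y | theta) evaluated on the grid for the observed y
    """
    joint = likelihood * prior        # p(y, theta)
    evidence = joint.sum()            # p(y), the normalizing constant
    return joint / evidence, evidence

# Coin-flip example: theta = P(heads), observing 7 heads in 10 flips
theta = np.linspace(0.01, 0.99, 99)
prior = np.ones_like(theta) / len(theta)    # flat prior
likelihood = theta**7 * (1 - theta)**3      # binomial kernel
posterior, evidence = grid_posterior(theta, prior, likelihood)
```

Dividing by the evidence p(y) is exactly the "eliminating p(y, θ)" step: the joint is rescaled so that the posterior sums to one.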

  18. Principles of Bayesian inference. • Formulation of a generative model: likelihood p(y|θ) and prior distribution p(θ) • Observation of data y • Update of beliefs based upon observations, given a prior state of knowledge

  19. Univariate Gaussian. With a normal prior p(θ) = N(μp, λp⁻¹) and a normal likelihood p(yi|θ) = N(θ, λe⁻¹) for each of N data points, the posterior is normal with precision λ = Nλe + λp and mean μ = (Nλe ȳ + λp μp)/λ. Posterior mean = precision-weighted combination of prior mean and data mean.
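A small sketch of this conjugate update (the numbers are illustrative; λ denotes precision, the inverse variance):

```python
import numpy as np

def gaussian_posterior(mu_p, lam_p, y, lam_e):
    """Conjugate update for a Gaussian mean with known precisions.

    Prior      theta ~ N(mu_p, 1/lam_p)
    Likelihood y_i   ~ N(theta, 1/lam_e), i = 1..N
    Posterior precision: lam = N*lam_e + lam_p
    Posterior mean: precision-weighted combination of data mean and prior mean.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    lam = n * lam_e + lam_p
    mu = (n * lam_e * y.mean() + lam_p * mu_p) / lam
    return mu, lam

# Four observations at 2.0, unit precisions: the posterior mean is pulled
# from the prior mean (0) toward the data mean (2) by 4/5
mu, lam = gaussian_posterior(mu_p=0.0, lam_p=1.0, y=[2.0, 2.0, 2.0, 2.0], lam_e=1.0)
```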

  20. Bayesian GLM: univariate case Normal densities

  21. Bayesian GLM: multivariate case. Normal densities. Estimation takes one step if the error covariance Ce and prior covariance Cp are known; otherwise it is iterative. [Figure: posterior contours in the (β1, β2) plane.]

  22. Approximate inference: optimization. The true posterior is approximated by a simpler (mean-field) approximate posterior, which is iteratively improved by maximising an objective function, the free energy. [Figure: true vs approximate posterior as a function of the parameter value.]

  23–26. Simple example – linear regression: ordinary least squares. [Figure sequence: data, then model fits using increasing numbers of bases (explanatory variables); the cost function is the sum of squared errors.]

  27. Simple example – linear regression. Ordinary least squares over-fits: the model fits the noise. The sum-of-squared-errors cost function is inadequate because it is blind to overly complex models. Solution: include uncertainty in the model parameters.
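The over-fitting point can be reproduced in a few lines: on the same noisy points, a 15th-order polynomial always achieves a lower training error than a 3rd-order one, which is exactly why the sum of squared errors alone cannot select a model (the data and the orders are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(x.size)  # noisy data

def ols_fit(x, y, order):
    """Ordinary least squares with polynomial bases: minimizes the sum of squared errors."""
    X = np.vander(x, order + 1)                  # design matrix of basis functions
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta

sse_simple = np.sum((y - ols_fit(x, y, 3)) ** 2)
sse_complex = np.sum((y - ols_fit(x, y, 15)) ** 2)
# The richer basis nests the simpler one, so its training error is always lower,
# even though it is partly fitting the noise.
```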

  28–33. Bayesian linear regression: priors and likelihood. Build-up across the slides: the model; then a Gaussian prior over the weights, with sample curves drawn from the prior (before observing any data) around the mean curve; then the Gaussian likelihood, shown for successive data points. [Equations shown as images on the slides.]
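Sampling curves from the prior, as on the slides, only requires drawing weight vectors from p(β) and pushing them through the bases; the polynomial bases and the prior precision α below are illustrative assumptions:

```python
import numpy as np

# Polynomial basis functions and a zero-mean Gaussian prior over the weights
x = np.linspace(-1, 1, 50)
X = np.vander(x, 4)                      # design matrix, 4 polynomial bases
alpha = 2.0                              # assumed prior precision
prior_cov = np.eye(X.shape[1]) / alpha   # p(beta) = N(0, alpha^-1 I)

rng = np.random.default_rng(1)
betas = rng.multivariate_normal(np.zeros(4), prior_cov, size=5)
sample_curves = betas @ X.T              # 5 curves drawn from the prior
mean_curve = np.zeros_like(x)            # the prior mean curve is zero
```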

  34–37. Bayesian linear regression: posterior. Combining model, prior and likelihood via Bayes' rule gives the posterior over the weights. [Figure sequence: the posterior, and sample curves drawn from it, concentrate around the data as more points are observed.]
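The posterior on these slides has the standard closed form for a Gaussian prior and likelihood; a sketch, assuming an isotropic prior N(0, α⁻¹I) and noise precision β:

```python
import numpy as np

def blr_posterior(X, y, alpha, beta):
    """Posterior over the weights for Bayesian linear regression.

    Prior      w ~ N(0, alpha^-1 I)
    Likelihood y ~ N(X w, beta^-1 I)
    Posterior  w | y ~ N(m, S), with
        S = (alpha I + beta X^T X)^-1
        m = beta S X^T y
    """
    d = X.shape[1]
    S = np.linalg.inv(alpha * np.eye(d) + beta * X.T @ X)
    m = beta * S @ X.T @ y
    return m, S

# With a nearly flat prior and clean data y = 2x, the posterior mean recovers the slope
x = np.linspace(0.1, 1.0, 10)
m, S = blr_posterior(x[:, None], 2.0 * x, alpha=1e-6, beta=100.0)
```

With informative data the posterior mean approaches the least-squares estimate; as the prior precision α grows, it shrinks toward zero.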

  38. Posterior Probability Maps (PPMs). Posterior distribution: the probability of the effect given the data (mean: size of effect; precision: variability). A posterior probability map is an image of the probability (confidence) that an activation exceeds some specified threshold sth, given the data y. Two thresholds: • an activation threshold sth, a percentage of the whole-brain mean signal (a physiologically relevant size of effect) • a probability threshold pth that voxels must exceed to be displayed (e.g. 95%)
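Given a Gaussian posterior per voxel, the two-threshold rule is a one-liner; the threshold and the example means and standard deviations below are illustrative:

```python
import numpy as np
from scipy.stats import norm

def ppm(post_mean, post_sd, s_th, p_th=0.95):
    """Probability that the effect exceeds the activation threshold s_th,
    and the mask of voxels whose probability exceeds the display threshold p_th."""
    p = 1.0 - norm.cdf(s_th, loc=post_mean, scale=post_sd)
    return p, p > p_th

# Two voxels: a weak, uncertain effect and a strong, precise one
p, show = ppm(post_mean=np.array([0.0, 1.0]),
              post_sd=np.array([1.0, 0.1]),
              s_th=0.5)
```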

  39. Bayesian linear regression: model selection. The normalizing constant in Bayes' rule is the model evidence, obtained by integrating out the parameters: p(y|m) = ∫ p(y|β, m) p(β|m) dβ. Models are compared via their evidence.
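For the linear-Gaussian model the integral is analytic: marginalising the weights leaves a Gaussian over the data. A sketch, assuming an isotropic prior N(0, α⁻¹I) and noise precision β:

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_evidence(X, y, alpha, beta):
    """Log model evidence for a linear-Gaussian model: integrating out the
    weights w ~ N(0, alpha^-1 I) under y = X w + e, e ~ N(0, beta^-1 I), gives
        p(y | m) = N(y; 0, beta^-1 I + alpha^-1 X X^T).
    """
    n = X.shape[0]
    cov = np.eye(n) / beta + X @ X.T / alpha
    return multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(y)

# The evidence prefers a basis that actually explains the data:
# a linear basis vs a constant basis, for data on a line
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x
ev_linear = log_evidence(x[:, None], y, alpha=1.0, beta=100.0)
ev_constant = log_evidence(np.ones((20, 1)), y, alpha=1.0, beta=100.0)
```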

  40. aMRI segmentation. [Figure: PPMs of each voxel belonging to grey matter, white matter and CSF.]

  41. Dynamic Causal Modelling: a generative model for fMRI and ERPs. A neural state equation with experimental inputs drives the observations. fMRI: hemodynamic forward model (neural activity → BOLD); neural model with 1 state variable per region, a bilinear state equation, no propagation delays. ERPs: electric/magnetic forward model (neural activity → EEG, MEG, LFP); neural model with 8 state variables per region, a nonlinear state equation, propagation delays.
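The bilinear neural state equation for fMRI, dx/dt = (A + u_mod·B)x + C·u_drv, can be simulated directly; the connection strengths below are illustrative values, not taken from any fitted model:

```python
import numpy as np

# Toy two-region bilinear state equation (illustrative parameters)
A = np.array([[-0.5, 0.0],
              [ 0.4, -0.5]])   # intrinsic connections (self-inhibition on the diagonal)
B = np.array([[0.0, 0.0],
              [0.3, 0.0]])     # modulation of the 1 -> 2 connection (e.g. by attention)
C = np.array([1.0, 0.0])       # driving input enters region 1

def simulate(T=200, dt=0.05, u_drv=1.0, u_mod=1.0):
    """Euler integration of dx/dt = (A + u_mod*B) x + C*u_drv."""
    x = np.zeros(2)
    trace = []
    for _ in range(T):
        dx = (A + u_mod * B) @ x + C * u_drv
        x = x + dt * dx
        trace.append(x.copy())
    return np.array(trace)

with_attention = simulate(u_mod=1.0)
without_attention = simulate(u_mod=0.0)
# Modulatory input strengthens the 1 -> 2 connection, raising region 2's response.
```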

  42. Bayesian Model Selection for fMRI. Four models m1–m4 in which attention modulates different connections in a stim → V1 → V5 ↔ PPC network. [Figure: marginal likelihoods for m1–m4, and the estimated effective synaptic strengths (values between 0.10 and 1.25) for the best model, m4.] [Stephan et al., Neuroimage, 2008]

  43. fMRI time series analysis with spatial priors (Penny et al., 2005). A spatial precision matrix controls the degree of smoothness of the GLM coefficients. The model places prior precisions on the GLM coefficients, the AR coefficients (correlated noise) and the data noise. [Figure: the generative model over the observations, with the aMRI, smoothed Y (RFT), and the ML vs VB estimates of β.]

  44. fMRI time series analysis with spatial priors: posterior probability maps. The posterior density q(βn) gives the probability of an effect given the data (mean: size of effect, Cbeta_*.img; standard deviation: uncertainty, SDbeta_*.img). The probability mass pn gives the PPM (spmP_*.img): display only voxels that exceed e.g. a 95% activation threshold.

  45–46. fMRI time series analysis with spatial priors: Bayesian model selection. Compute a log-evidence map for each model (1…K) and subject (1…N). From these, BMS maps: a PPM and an EPM giving the probability that model k generated the data. (Joao et al., 2009)
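Given per-voxel (or per-subject) log-evidences, posterior model probabilities follow from Bayes' rule over models; a sketch assuming a uniform model prior:

```python
import numpy as np

def model_posterior(log_ev, prior=None):
    """Posterior model probabilities from log-evidences (uniform model prior
    by default), using the log-sum-exp trick for numerical stability."""
    log_ev = np.asarray(log_ev, dtype=float)
    if prior is not None:
        log_ev = log_ev + np.log(prior)
    z = log_ev - log_ev.max()   # shift so the largest term is exp(0)
    w = np.exp(z)
    return w / w.sum()

# Log-evidences differing by 3 nats: the Bayes factor is exp(3)
p = model_posterior([-1200.0, -1197.0])
```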

  47. Reminder… [Figure: the long-term vs short-term memory observer models.]

  48. Compare two models: the long-term memory model vs the short-term memory model. Both are built from information-theoretic (IT) indices: H = entropy, h = surprise, I = mutual information, i = mutual surprise; missed-trial onsets are modelled as well. The long-term model's IT indices are smoother.
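A sketch of such an observer model (the count-based estimator, the add-one smoothing and the memory lengths are assumptions for illustration): surprise h and entropy H are computed from a predictive distribution estimated over the last `memory` trials.

```python
import numpy as np

def surprise_and_entropy(events, k, memory):
    """Surprise h and entropy H of each event under a count-based observer
    that estimates event probabilities from the last `memory` trials
    (k = number of event types)."""
    h, H = [], []
    for t, e in enumerate(events):
        window = events[max(0, t - memory):t]
        counts = np.bincount(window, minlength=k) + 1.0   # add-one smoothing
        p = counts / counts.sum()
        h.append(-np.log2(p[e]))                # surprise of the observed event
        H.append(-np.sum(p * np.log2(p)))       # entropy of the predictive distribution
    return np.array(h), np.array(H)

rng = np.random.default_rng(2)
events = rng.integers(0, 4, size=40)            # 40 trials, 4 event types
h_short, H_short = surprise_and_entropy(events, k=4, memory=2)
h_long, H_long = surprise_and_entropy(events, k=4, memory=40)
```

The short-memory observer's estimates swing with the last few trials, while the long-memory observer's change slowly as counts accumulate.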

  49. Group data: Bayesian Model Selection maps. [Figure: regions best explained by the short-term memory model and regions best explained by the long-term memory model, including frontal cortex (executive control) and primary visual cortex.]

  50. Thank you
