
DCM – the theory

Learn about Bayesian inference in Dynamic Causal Modeling (DCM), including choosing the best model and group analysis. Explore classical and Bayesian inference in DCM for testing null hypotheses and calculating posterior probabilities, and see how Bayes' rule is used to estimate the most probable underlying model given observed data.


Presentation Transcript


  1. DCM – the theory

  2. Bayesian inference • DCM examples • Choosing the best model • Group analysis

  3. Bayesian inference
  • Classical inference – tests the null hypothesis: is the effect significantly different from zero? In SPM terms, is any activation due to the effect of the regressor rather than random noise?
  • Bayesian inference – the probability that activation exceeds a set threshold, given the data. Derived from the posterior probability (calculated via Bayes' rule). No false positives (so no need for multiple-comparisons correction!)

  4. Bayes' rule
  • If A and B are two separate but possibly dependent random events, then:
  • Probability of A and B occurring together: P(A, B)
  • Conditional probability of A, given that B occurs: P(A|B)
  • Conditional probability of B, given that A occurs: P(B|A)
  P(A, B) = P(A|B) P(B) = P(B|A) P(A)   (1)
  • Dividing the right-hand pair of expressions by P(B) gives Bayes' rule:
  P(A|B) = P(B|A) P(A) / P(B)   (2)
  • In probabilistic inference, we try to estimate the most probable underlying model for a random process, based on observed data. If A represents a given set of model parameters, and B represents the set of observed data values, then:
  • P(A) is the prior probability of the model A (in the absence of any evidence);
  • P(B) is the probability of the evidence B;
  • P(B|A) is the likelihood that the evidence B was produced, given that the model was A;
  • P(A|B) is the posterior probability of the model being A, given that the evidence is B.
  Posterior probability ∝ Likelihood × Prior probability
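To make the arithmetic of equation (2) concrete, here is a minimal numerical sketch in Python; the prior and likelihood values are invented purely for illustration:

```python
# Minimal sketch of Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B).
# All numbers are invented for illustration.
p_A = 0.01             # prior P(A): probability the hypothesis A is true
p_B_given_A = 0.90     # likelihood P(B|A): probability of the evidence if A holds
p_B_given_notA = 0.05  # probability of the evidence if A does not hold

# Marginal probability of the evidence, P(B), via the sum rule
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Posterior: proportional to likelihood times prior, normalised by P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(f"P(A|B) = {p_A_given_B:.3f}")  # ~0.154: a weak prior means the evidence only moderately raises belief
```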

  5. Bayes' rule in DCM
  • Likelihood derived from the error and confounds (e.g. drift)
  • Priors – empirical (haemodynamic parameters) and non-empirical (e.g. shrinkage priors, temporal scaling)
  • The posterior probability of each effect is calculated, and the probability that it exceeds a set threshold is expressed as a percentage
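As a rough illustration of the last point: estimation yields an approximately Gaussian posterior over each coupling parameter, from which the exceedance probability can be read off directly. A minimal sketch, with invented mean, standard deviation and threshold values:

```python
from scipy.stats import norm

# Sketch: probability that a coupling parameter exceeds a threshold,
# assuming an approximately Gaussian posterior. Values are invented.
posterior_mean = 0.47  # posterior expectation of the parameter
posterior_sd = 0.20    # posterior standard deviation
threshold = 0.0        # e.g. "is this connection greater than zero?"

# P(parameter > threshold) under N(mean, sd^2), reported as a percentage
p_exceed = norm.sf(threshold, loc=posterior_mean, scale=posterior_sd)
print(f"P(parameter > {threshold}) = {100 * p_exceed:.0f}%")  # ~99%
```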

  6. An example [Figure: SPM{F} map highlighting regions A1, A2 and WA]

  7. [Figure: model regions A1, A2 and WA, with inputs u1 = stimulus (perturbation) and u2 = set (context)]

  8. [Figure: as slide 7] • Full intrinsic connectivity: a

  9. [Figure: as slide 7] • Full intrinsic connectivity: a • u1 activates A1: c

  10. [Figure: as slide 7] • Full intrinsic connectivity: a • u1 activates A1: c • u1 may modulate self-connections → induced connectivities: b1

  11. [Figure: as slide 7] • Full intrinsic connectivity: a • u1 activates A1: c • u1 may modulate self-connections → induced connectivities: b1 • u2 may modulate anything → induced connectivities: b2
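For reference, the a, b and c quantities built up on slides 8–11 are the parameter matrices of DCM's bilinear state equation (standard DCM notation, not shown on the slides):

$$\dot{x} = \Big(A + \sum_j u_j B^{(j)}\Big)\,x + C\,u$$

Here x is the vector of regional neuronal states, A holds the intrinsic connections, each B^{(j)} holds the connectivity changes induced by input u_j, and C holds the direct (extrinsic) influences of the inputs.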

  12. [Figure: fitted model with regions A1, A2, WA and inputs u1, u2; parameter estimates with posterior probabilities: .92 (100%), .47 (98%), .38 (94%), .37 (100%), .37 (91%), -.62 (99%), -.51 (99%)]

  13. [Figure: highlighting] Intrinsic connectivity a: .92 (100%), .47 (98%), .38 (94%)

  14. [Figure: additionally highlighting] Extrinsic influence c: .37 (100%)

  15. [Figure: additionally highlighting] Connectivity induced by u1, b1: -.62 (99%), -.51 (99%)

  16. [Figure: as slide 15, with a "saturation" annotation on the u1-modulated pathway]

  17. [Figure: additionally highlighting] Connectivity induced by u2, b2: .37 (91%)

  18. [Figure: as slide 17, with an "adaptation" annotation on the u2-modulated pathway]

  19. Another example • Design: moving dots (u1), attention (u2)

  20. Another example • Design: moving dots (u1), attention (u2) • SPM analysis: V1, V5, SPC, IFG

  21. Another example • Design: moving dots (u1), attention (u2) • SPM analysis: V1, V5, SPC, IFG • Literature: V5 motion-sensitive

  22. Another example • Design: moving dots (u1), attention (u2) • SPM analysis: V1, V5, SPC, IFG • Literature: V5 motion-sensitive • Previous connectivity analyses: SPC modulates V5, IFG modulates SPC

  23. Another example
  • Design: moving dots (u1), attention (u2)
  • SPM analysis: V1, V5, SPC, IFG
  • Literature: V5 motion-sensitive
  • Previous connectivity analyses: SPC modulates V5, IFG modulates SPC
  • Constraints:
  - intrinsic connectivity: V1 → V5 → SPC → IFG
  - u1 → V1
  - u2: modulates connections within V1 → V5 → SPC → IFG
  - u3 (motion): modulates connections within V1 → V5 → SPC → IFG

  24. [As slide 23, with u1 identified as the photic input]
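The constraints on slides 23–24 translate naturally into the binary a, b and c matrices that specify a DCM. A minimal sketch, assuming the region ordering and allowed edges read off the slide (the exact set of modulated connections is an assumption):

```python
import numpy as np

# Regions in assumed order; inputs are u1 (photic), u2 (attention), u3 (motion)
regions = ["V1", "V5", "SPC", "IFG"]

# a[i, j] = 1 allows a connection from region j to region i:
# the forward chain V1 -> V5 -> SPC -> IFG plus self-connections
a = np.eye(4, dtype=int)
a[1, 0] = a[2, 1] = a[3, 2] = 1

# c[i, k] = 1 lets input k drive region i directly: only u1 -> V1
c = np.zeros((4, 3), dtype=int)
c[0, 0] = 1

# b[:, :, k] marks connections input k may modulate; here we assume
# u2 and u3 may each modulate the forward chain connections
b = np.zeros((4, 4, 3), dtype=int)
for k in (1, 2):  # k = 1: u2 (attention), k = 2: u3 (motion)
    b[1, 0, k] = b[2, 1, k] = b[3, 2, k] = 1
```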

  25. Another example [Figure: fitted DCM with inputs Photic (u1), Attention (u2) and Motion (u3) on regions V1, V5, SPC and IFG; parameter estimates with posterior probabilities: .52 (98%), .37 (90%), .42 (100%), .82 (100%), .56 (99%), .47 (100%), .69 (100%), .65 (100%)]

  26. Comparison of models
  • Model 1: attentional modulation of V1 → V5
  • Model 2: attentional modulation of SPC → V5
  • Model 3: attentional modulation of V1 → V5 and SPC → V5
  [Figure: the three fitted models (inputs: photic, motion, attention; regions V1, V5, SPC) with their parameter estimates]
  Bayesian model selection: model 1 better than model 2; model 1 and model 3 equal → decision for model 1: in this instance, attention primarily modulates V1 → V5

  27. Comparison of models • Bayesian inference again • Depends on both the goodness of fit and the complexity of the various models
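A minimal sketch of what slide 27 describes: models are compared through their (approximate) log model evidence, which rewards fit and penalises complexity. The log-evidence values below are invented, chosen only to mirror the outcome on slide 26:

```python
import math

# Invented (approximate) log model evidences for the three models
log_evidence = {"model 1": -120.0, "model 2": -126.5, "model 3": -120.3}

# Bayes factor of model 1 over each alternative:
# BF = exp(difference in log evidence)
for name, logev in log_evidence.items():
    bf = math.exp(log_evidence["model 1"] - logev)
    print(f"Bayes factor, model 1 vs {name}: {bf:.2f}")

# Model 1 vs model 2 gives a very large Bayes factor (strong evidence),
# while model 1 vs model 3 is close to 1 (no real difference), so the
# simpler model 1 is preferred.
```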

  28. Inference about DCM parameters: group analysis
  • In analogy to "random effects" analyses in SPM, 2nd-level analyses can be applied to DCM parameters:
  - Separate fitting of identical models for each subject
  - Selection of bilinear parameters of interest
  - one-sample t-test: parameter > 0?
  - paired t-test: parameter 1 > parameter 2?
  - rmANOVA: e.g. in the case of multiple sessions per subject
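A minimal sketch of such a 2nd-level analysis, using invented per-subject parameter estimates (one value per subject for each connection of interest):

```python
import numpy as np
from scipy import stats

# Invented per-subject DCM parameter estimates for two connections
rng = np.random.default_rng(0)
param_1 = rng.normal(0.4, 0.2, size=12)  # e.g. modulation of one connection
param_2 = rng.normal(0.1, 0.2, size=12)  # e.g. modulation of another connection

# One-sample t-test: is parameter 1 reliably greater than zero across subjects?
t1, p1 = stats.ttest_1samp(param_1, 0.0)
print(f"one-sample: t = {t1:.2f}, p = {p1:.4f}")

# Paired t-test: is parameter 1 reliably larger than parameter 2?
t2, p2 = stats.ttest_rel(param_1, param_2)
print(f"paired: t = {t2:.2f}, p = {p2:.4f}")
```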

  29. "Laughing is a celebration of the good, and it's also how we deal with the bad. Laughing, like crying, is a good way of eliminating toxins from the body. Since the mind and body are connected, you use an amazing amount of muscles when you laugh." http://www.balloonhat.com/
