
Bayesian Statistics Without Tears: Prelude






Presentation Transcript


  1. Bayesian Statistics Without Tears: Prelude Eric-Jan Wagenmakers

  2. Three Schools of Statistical Inference • Neyman-Pearson: α-level, power calculations, two hypotheses, guide for action (i.e., what to do). • Fisher: p-values, one hypothesis (i.e., H0), quantifies evidence against H0. • Bayes: prior and posterior distributions, attaches probabilities to parameters and hypotheses.

  3. A Freudian Analogy • Neyman-Pearson: The Superego. • Fisher: The Ego. • Bayes: The Id. Claim: What Id really wants is to attach probabilities to hypotheses and parameters. This wish is suppressed by the Superego and the Ego. The result is unconscious internal conflict.

  4. Internal Conflict Causes Misinterpretations • p < .05 means that H0 is unlikely to be true, and can be rejected. • p > .10 means that H0 is likely to be true. • For a given parameter μ, a 95% confidence interval from, say, a to b means that there is a 95% chance that μ lies in between a and b.

  5. Two Ways to Resolve the Internal Conflict • Strengthen Superego and Ego by teaching the standard statistical methodology more rigorously. Suppress Id even more! • Give Id what it wants.

  6. What is Bayesian Inference? Why be Bayesian? Eric-Jan Wagenmakers

  7. What is Bayesian Inference?

  8. What is Bayesian Inference? “Common sense expressed in numbers”

  9. What is Bayesian Inference? “The means by which rational agents draw optimal conclusions in an uncertain environment”

  10. What is Bayesian Inference? “The only statistical procedure that is coherent, meaning that it avoids statements that are internally inconsistent.”

  11. What is Bayesian Inference? “A method for rational updating of beliefs about the world”

  12. What is Bayesian Inference? “The only good statistics”

  13. Outline • Bayes in a Nutshell • The Inevitability of Probability • Bayesian Revolutions • This Course

  14. Bayesian Inference in a Nutshell • In Bayesian inference, uncertainty or degree of belief is quantified by probability. • Prior beliefs are updated by means of the data to yield posterior beliefs.

  15. Bayesian Parameter Estimation: Example • We prepare for you a series of 10 factual true/false questions of equal difficulty. • You answer 9 out of 10 questions correctly. • What is your latent probability θ of answering any one question correctly?

  16. Bayesian Parameter Estimation: Example • We start with a prior distribution for θ. This reflects all we know about θ prior to the experiment. Here we make a standard choice and assume that all values of θ are equally likely a priori.

  17. Bayesian Parameter Estimation: Example • We then update the prior distribution by means of the data (technically, the likelihood)to arrive at a posterior distribution. • The posterior distribution is a compromise between what we knew before the experiment and what we have learned from the experiment. The posterior distribution reflects all that we know about θ.

  18. Mode = 0.9 95% credible interval: (0.59, 0.98) NB. We do not have to use the uniform prior!
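The numbers on this slide can be reproduced with a short script. This is a sketch, not from the slides themselves: with a uniform Beta(1, 1) prior and 9 successes in 10 trials, conjugacy gives a Beta(10, 2) posterior, whose mode is 0.9 and whose central 95% credible interval is roughly (0.59, 0.98). The quantile routine below (a simple trapezoid inversion of the Beta CDF) is an illustrative stand-in for a library call.

```python
import math

def beta_pdf(x, a, b):
    """Density of the Beta(a, b) distribution."""
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * x ** (a - 1) * (1 - x) ** (b - 1)

def beta_quantile(p, a, b, steps=200_000):
    """Invert the Beta CDF by trapezoidal integration on a fine grid."""
    dx = 1.0 / steps
    cdf, prev = 0.0, beta_pdf(0.0, a, b)
    for i in range(1, steps + 1):
        x = i * dx
        cur = beta_pdf(x, a, b)
        cdf += 0.5 * (prev + cur) * dx
        if cdf >= p:
            return x
        prev = cur
    return 1.0

# Beta(1, 1) prior updated with 9 successes and 1 failure -> Beta(10, 2)
a_post, b_post = 1 + 9, 1 + 1
mode = (a_post - 1) / (a_post + b_post - 2)      # = 0.9, as on the slide
lo = beta_quantile(0.025, a_post, b_post)
hi = beta_quantile(0.975, a_post, b_post)
print(f"mode = {mode:.2f}, 95% credible interval = ({lo:.2f}, {hi:.2f})")
```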

  19. Outline • Bayes in a Nutshell • The Inevitability of Probability • Bayesian Revolutions • This Course

  20. The Inevitability of Probability • Why would one measure “degree of belief” by means of probability? Couldn’t we choose something else that makes sense? • Yes, perhaps we can, but the choice of probability is anything but ad-hoc.

  21. The Inevitability of Probability • Assume “degree of belief” can be measured by a single number. • Assume you are rational, that is, not self-contradictory or “obviously silly”. • Then degree of belief can be shown to follow the same rules as the probability calculus.

  22. The Inevitability of Probability • For instance, a rational agent would not hold intransitive beliefs, such as judging A more plausible than B, B more plausible than C, and yet C more plausible than A.

  23. The Inevitability of Probability • When you use a single number to measure uncertainty or quantify evidence, and these numbers do not follow the rules of probability calculus, you can (almost certainly?) be shown to be silly or incoherent. • One of the theoretical attractions of the Bayesian paradigm is that it ensures coherence right from the start.
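The classic demonstration that incoherent beliefs can "be shown to be silly" is the Dutch book argument. The sketch below uses hypothetical numbers (an agent who pays belief(E) for a ticket that pays 1 if E occurs, and assigns belief 0.6 to both "rain" and "no rain", violating additivity); it is an illustration of the idea, not material from the slides.

```python
# Incoherent beliefs: the two probabilities sum to 1.2, not 1.0.
belief = {"rain": 0.6, "no_rain": 0.6}

# The agent buys one ticket on each event, paying belief(E) per ticket.
cost = belief["rain"] + belief["no_rain"]        # pays 1.2 in total

for outcome in ("rain", "no_rain"):
    payout = 1.0                                 # exactly one ticket pays off
    net = payout - cost
    print(f"{outcome}: net = {net:+.1f}")        # a sure loss of 0.2 either way
```

Whatever happens, the agent loses 0.2: incoherence makes them exploitable, which is the sense in which probability is inevitable.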

  24. Coherence I • Coherence is also key in de Finetti’s conceptualization of probability.

  25. Coherence II • One aspect of coherence is that “today’s posterior is tomorrow’s prior”. • Suppose we have exchangeable (iid) data x = {x1, x2}. Now we can update our prior using x, using first x1 and then x2, or using first x2 and then x1. • All the procedures will result in exactly the same posterior distribution.
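For the coin-flip model of the earlier example, "today's posterior is tomorrow's prior" is easy to check directly: a Beta(a, b) prior updated with a Bernoulli observation just increments one of its counts, so the order of x1 and x2 cannot matter. A minimal sketch (the `update` helper is illustrative, not from the slides):

```python
def update(prior, x):
    """One Bayesian update for a Bernoulli observation x under a Beta prior."""
    a, b = prior
    return (a + x, b + (1 - x))

x1, x2 = 1, 0                 # e.g. one success, then one failure
prior = (1, 1)                # uniform Beta(1, 1) prior

post_12 = update(update(prior, x1), x2)                     # x1 first, then x2
post_21 = update(update(prior, x2), x1)                     # x2 first, then x1
post_batch = (prior[0] + x1 + x2, prior[1] + 2 - x1 - x2)   # both at once

print(post_12, post_21, post_batch)   # all three are Beta(2, 2)
```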

  26. Coherence III • Assume we have three models: M1, M2, M3. • After seeing the data, suppose that M1 is 3 times more plausible than M2, and M2 is 4 times more plausible than M3. • By transitivity, M1 is 3x4=12 times more plausible than M3.
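The transitivity on this slide is just multiplication of posterior odds, since each odds ratio is a ratio of posterior probabilities. A one-line check of the slide's arithmetic:

```python
odds_m1_m2 = 3.0                      # M1 is 3 times more plausible than M2
odds_m2_m3 = 4.0                      # M2 is 4 times more plausible than M3
odds_m1_m3 = odds_m1_m2 * odds_m2_m3  # ratios compose multiplicatively
print(odds_m1_m3)                     # 12.0
```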

  27. Outline • Bayes in a Nutshell • The Inevitability of Probability • Bayesian Revolutions • This Course

  28. The Bayesian Revolution • Until about 1990, Bayesian statistics could only be applied to a select subset of very simple models. • More recently, Bayesian statistics has undergone a transformation: with current numerical techniques, Bayesian models are “limited only by the user’s imagination.”

  29. The Bayesian Revolution in Statistics

  30. The Bayesian Revolution in Statistics

  31. The Bayesian Revolution in Psychology?

  32. Are Psychologists Inconsistent? • The content of Psych Review shows that • Psychologists are happy to develop Bayesian models for human cognition and human behavior, based on the assumption that agents or people process noisy information in a rational or optimal way; • But psychologists do not use Bayesian models to analyze their own data statistically!

  33. Why Bayes is Now Popular: Markov chain Monte Carlo!

  34. Markov Chain Monte Carlo • Instead of calculating the posterior analytically, numerical techniques such as MCMC approximate the posterior by drawing samples from it. • Consider again our earlier example…

  35. Mode = 0.89 95% credible interval: (0.59, 0.98) With 9000 samples, almost identical to the analytical result.
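The MCMC result on this slide can be approximated with a few lines of code. The sketch below is illustrative and not the slides' own implementation: a random-walk Metropolis sampler drawing from the unnormalised posterior θ⁹(1−θ) of the earlier example (uniform prior, 9 of 10 correct), with the step size, burn-in, and seed chosen arbitrarily.

```python
import math
import random

random.seed(1)

def log_post(theta):
    """Unnormalised log-posterior: 9 successes, 1 failure, uniform prior."""
    if not 0.0 < theta < 1.0:
        return -math.inf
    return 9 * math.log(theta) + math.log(1 - theta)

theta, samples = 0.5, []
for i in range(10_000):
    proposal = theta + random.gauss(0.0, 0.1)        # random-walk proposal
    if math.log(random.random()) < log_post(proposal) - log_post(theta):
        theta = proposal                             # Metropolis accept step
    if i >= 1_000:                                   # discard burn-in
        samples.append(theta)

samples.sort()
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(f"95% credible interval from {len(samples)} samples: ({lo:.2f}, {hi:.2f})")
```

With 9000 retained samples, the interval comes out close to the analytical (0.59, 0.98): the samples stand in for the posterior, no integration required.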

  36. Want to Know More About MCMC?

  37. MCMC • With MCMC, the models you can build and estimate are said to be “limited only by the user’s imagination”. • But how do you get MCMC to work? • Option 1: write the code yourself. • Option 2: use WinBUGS!

  38. Outline • Bayes in a Nutshell • The Inevitability of Probability • Bayesian Revolutions • This Course

  39. A Workshop in Bayesian Modeling for Cognitive Science Eric-Jan Wagenmakers

  40. The Bayesian Book • …is a course book used at UvA and UCI. • …is still regularly updated. • …is freely available at my homepage, at http://www.ejwagenmakers.com/BayesCourse/BayesBook.html • …greatly benefits from your suggestions for improvement! [e.g., typos, awkward sentences, etc.]

  41. Contributors Michael Lee http://www.socsci.uci.edu/~mdlee/

  42. Contributors Dora Matzke

  43. Contributors Ruud Wetzels http://www.ruudwetzels.com/

  44. Why We Like Graphical Bayesian Modeling • It is fun. • It is cool. • It is easy. • It is principled. • It is superior. • It is useful. • It is flexible.

  45. Our Goals These Weeks Are… • For you to experience some of the possibilities that WinBUGS has to offer. • For you to get some hands-on training by trying out some programs. • For you to work at your own pace. • For you to get answers to questions when you get stuck.

  46. Our Goals These Weeks Are NOT… • For you to become a Bayesian graphical modeling expert in one week. • For you to gain deep insight into the statistical foundations of Bayesian inference. • For you to get frustrated when the programs do not work or you do not understand the materials (please ask questions).

  47. Want to Know More About Bayes?

  48. Want to Know More About Bayes?
