
1
**Bayesian Statistics Without Tears: Prelude**

Eric-Jan Wagenmakers

2
**Three Schools of Statistical Inference**

- Neyman-Pearson: α-level, power calculations, two hypotheses; a guide for action (i.e., what to do).
- Fisher: p-values, one hypothesis (i.e., H0); quantifies evidence against H0.
- Bayes: prior and posterior distributions; attaches probabilities to parameters and hypotheses.

3
**A Freudian Analogy**

Neyman-Pearson: The Superego. Fisher: The Ego. Bayes: The Id. Claim: what the Id really wants is to attach probabilities to hypotheses and parameters. This wish is suppressed by the Superego and the Ego; the result is unconscious internal conflict.

4
**Internal Conflict Causes Misinterpretations**

- p < .05 means that H0 is unlikely to be true and can be rejected.
- p > .10 means that H0 is likely to be true.
- For a given parameter μ, a 95% confidence interval from, say, a to b means that there is a 95% chance that μ lies between a and b.

5
**Two Ways to Resolve the Internal Conflict**

1. Strengthen the Superego and the Ego by teaching the standard statistical methodology more rigorously: suppress the Id even more!
2. Give the Id what it wants.

6
**What is Bayesian Inference? Why be Bayesian?**

Eric-Jan Wagenmakers

7
**What is Bayesian Inference?**

8
**What is Bayesian Inference?**

“Common sense expressed in numbers”

9
**What is Bayesian Inference?**

“The means by which rational agents draw optimal conclusions in an uncertain environment”

10
**What is Bayesian Inference?**

“The only statistical procedure that is coherent, meaning that it avoids statements that are internally inconsistent.”

11
**What is Bayesian Inference?**

“A method for rational updating of beliefs about the world”

12
**What is Bayesian Inference?**

“The only good statistics”

13
**Outline**

- Bayes in a Nutshell
- The Inevitability of Probability
- Bayesian Revolutions
- This Course

14
**Bayesian Inference in a Nutshell**

In Bayesian inference, uncertainty or degree of belief is quantified by probability. Prior beliefs are updated by means of the data to yield posterior beliefs.

15
**Bayesian Parameter Estimation: Example**

We prepare for you a series of 10 factual true/false questions of equal difficulty. You answer 9 out of 10 questions correctly. What is your latent probability θ of answering any one question correctly?

16
**Bayesian Parameter Estimation: Example**

We start with a prior distribution for θ. This reflects all we know about θ prior to the experiment. Here we make a standard choice and assume that all values of θ are equally likely a priori.

17
**Bayesian Parameter Estimation: Example**

We then update the prior distribution by means of the data (technically, the likelihood) to arrive at a posterior distribution. The posterior distribution is a compromise between what we knew before the experiment and what we have learned from the experiment. The posterior distribution reflects all that we know about θ.

18
Mode = 0.9; 95% credible interval: (0.59, 0.98). NB: we do not have to use the uniform prior!
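The numbers on this slide can be reproduced directly: with the uniform prior, the posterior for 9 correct answers out of 10 is a Beta(10, 2) distribution. A minimal sketch in Python (used here only for illustration; the course itself uses WinBUGS and R) that recovers the mode and the central 95% interval on a grid:

```python
# Posterior for theta after 9 correct answers out of 10,
# starting from a uniform prior: Beta(10, 2) up to a constant.
n_grid = 100001
grid = [i / (n_grid - 1) for i in range(n_grid)]

# Unnormalized posterior: likelihood theta^9 * (1 - theta) times a flat prior.
post = [t**9 * (1 - t) for t in grid]
total = sum(post)
post = [p / total for p in post]  # normalize on the grid

# Posterior mode: analytically, the mode of Beta(10, 2) is (10-1)/(10+2-2) = 0.9.
mode = grid[max(range(n_grid), key=lambda i: post[i])]

# Central 95% interval from the cumulative distribution.
cum, lo, hi = 0.0, None, None
for t, p in zip(grid, post):
    cum += p
    if lo is None and cum >= 0.025:
        lo = t
    if hi is None and cum >= 0.975:
        hi = t

print(round(mode, 2), round(lo, 2), round(hi, 2))  # 0.9 0.59 0.98
```

The grid approximation is crude but makes the prior-to-posterior update explicit; the same numbers follow analytically from the Beta(10, 2) density.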

19
**Outline**

- Bayes in a Nutshell
- The Inevitability of Probability
- Bayesian Revolutions
- This Course

20
**The Inevitability of Probability**

Why would one measure “degree of belief” by means of probability? Couldn’t we choose something else that makes sense? Yes, perhaps we can, but the choice of probability is anything but ad-hoc.

21
**The Inevitability of Probability**

Assume “degree of belief” can be measured by a single number. Assume you are rational, that is, not self-contradictory or “obviously silly”. Then degree of belief can be shown to follow the same rules as the probability calculus.

22
**The Inevitability of Probability**

For instance, a rational agent would not hold intransitive beliefs, such as believing A more strongly than B, B more strongly than C, and yet C more strongly than A.

23
**The Inevitability of Probability**

When you use a single number to measure uncertainty or quantify evidence, and these numbers do not follow the rules of probability calculus, you can (almost certainly?) be shown to be silly or incoherent. One of the theoretical attractions of the Bayesian paradigm is that it ensures coherence right from the start.

24
**Coherence I**

Coherence is also key in de Finetti’s conceptualization of probability.

25
**Coherence II**

One aspect of coherence is that “today’s posterior is tomorrow’s prior”. Suppose we have exchangeable (iid) data x = {x1, x2}. We can update our prior using x as a whole, using first x1 and then x2, or using first x2 and then x1. All three procedures result in exactly the same posterior distribution.
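Using conjugate beta-binomial updating, this order-invariance can be checked in a few lines. A minimal sketch in Python (illustrative only; the observation values are made up):

```python
# Beta-binomial conjugate updating: a Beta(a, b) prior updated with a
# Bernoulli observation x becomes Beta(a + x, b + 1 - x).
def update(prior, x):
    a, b = prior
    return (a + x, b + 1 - x)

prior = (1, 1)          # uniform prior
x1, x2 = 1, 0           # two exchangeable observations

seq_12 = update(update(prior, x1), x2)   # first x1, then x2
seq_21 = update(update(prior, x2), x1)   # first x2, then x1
batch = (prior[0] + x1 + x2, prior[1] + 2 - x1 - x2)  # both at once

print(seq_12, seq_21, batch)  # all three give Beta(2, 2)
```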

26
**Coherence III**

Assume we have three models: M1, M2, M3. After seeing the data, suppose that M1 is 3 times more plausible than M2, and M2 is 4 times more plausible than M3. By transitivity, M1 is 3 × 4 = 12 times more plausible than M3.
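A toy check of this transitivity in Python (the plausibility values are chosen only to match the slide’s ratios):

```python
# Relative plausibilities consistent with the slide:
# M1 : M2 = 3 : 1 and M2 : M3 = 4 : 1, hence M1 : M3 = 12 : 1.
p = {"M1": 12.0, "M2": 4.0, "M3": 1.0}
total = sum(p.values())
post = {m: v / total for m, v in p.items()}  # normalized model probabilities

# Normalization leaves the ratios untouched.
print(round(post["M1"] / post["M2"], 6))  # 3.0
print(round(post["M2"] / post["M3"], 6))  # 4.0
print(round(post["M1"] / post["M3"], 6))  # 12.0 — transitivity of the odds
```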

27
**Outline**

- Bayes in a Nutshell
- The Inevitability of Probability
- Bayesian Revolutions
- This Course

28
**The Bayesian Revolution**

Until about 1990, Bayesian statistics could be applied only to a select subset of very simple models. Since then, Bayesian statistics has undergone a transformation: with current numerical techniques, Bayesian models are “limited only by the user’s imagination.”

29
**The Bayesian Revolution in Statistics**

30
**The Bayesian Revolution in Statistics**

31
**The Bayesian Revolution in Psychology?**

32
**Are Psychologists Inconsistent?**

The content of Psych Review shows that psychologists are happy to develop Bayesian models of human cognition and behavior, based on the assumption that agents or people process noisy information in a rational or optimal way. But psychologists do not use Bayesian models to analyze their own data statistically!

33
**Why Bayes is Now Popular**

Markov chain Monte Carlo!

34
**Markov Chain Monte Carlo**

Instead of calculating the posterior analytically, numerical techniques such as MCMC approximate the posterior by drawing samples from it. Consider again our earlier example…

36
Mode = 0.89; 95% credible interval: (0.59, 0.98). With 9000 samples, this is almost identical to the analytical result.
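To make the idea concrete, here is a minimal random-walk Metropolis sampler in plain Python for the earlier example (9 correct out of 10, uniform prior). WinBUGS itself uses Gibbs sampling, but the principle of approximating the posterior by drawing samples from it is the same:

```python
import math
import random

random.seed(1)

def log_post(theta):
    # Log of the unnormalized posterior for 9 correct answers out of 10
    # under a uniform prior: 9*log(theta) + log(1 - theta).
    if not 0.0 < theta < 1.0:
        return float("-inf")
    return 9 * math.log(theta) + math.log(1 - theta)

theta = 0.5            # arbitrary starting value
samples = []
for i in range(50000):
    prop = theta + random.gauss(0, 0.1)   # random-walk proposal
    # Metropolis step: accept with probability
    # min(1, posterior(prop) / posterior(theta)), done on the log scale.
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop
    if i >= 5000:                         # discard burn-in
        samples.append(theta)

samples.sort()
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(round(lo, 2), round(hi, 2))  # close to the analytical (0.59, 0.98)
```

With enough samples, the empirical quantiles of the chain approximate the analytical credible interval, just as the slide reports for the WinBUGS run.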

38
**Want to Know More About MCMC?**

39
**MCMC**

With MCMC, the models you can build and estimate are said to be “limited only by the user’s imagination”. But how do you get MCMC to work? Option 1: write the code yourself. Option 2: use WinBUGS!

40
**Outline**

- Bayes in a Nutshell
- The Inevitability of Probability
- Bayesian Revolutions
- This Course

41
**A Workshop in Bayesian Modeling for Cognitive Science**

Eric-Jan Wagenmakers

42
**The Bayesian Book**

…is a course book used at UvA and UCI. …is still regularly updated. …is freely available at my homepage. …greatly benefits from your suggestions for improvement [e.g., typos, awkward sentences, etc.]!

43
Contributors Michael Lee

44
Contributors Dora Matzke

45
Contributors Ruud Wetzels

46
**Why We Like Graphical Bayesian Modeling**

It is fun. It is cool. It is easy. It is principled. It is superior. It is useful. It is flexible.

47
**Our Goals These Weeks Are…**

For you to experience some of the possibilities that WinBUGS has to offer. For you to get some hands-on training by trying out some programs. For you to work at your own pace. For you to get answers to questions when you get stuck.

48
**Our Goals These Weeks Are NOT…**

For you to become a Bayesian graphical modeling expert in one week. For you to gain deep insight into the statistical foundations of Bayesian inference. For you to get frustrated when the programs do not work or you do not understand the materials (please ask questions).

49
**Want to Know More About Bayes?**

50
**Want to Know More About Bayes?**

51
**WinBUGS: Bayesian inference Using Gibbs Sampling**

You want to have this installed (plus the registration key).

52
**WinBUGS**

- Knows many probability distributions (likelihoods);
- allows you to specify a model;
- allows you to specify priors;
- will then automatically run the MCMC sampling routines and produce output.
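For the question-answering example earlier, such a model specification is only a few lines. The sketch below follows the standard “rate” model from the course book (the names k, n, and theta are illustrative; k and n would be supplied as data):

```
model{
   # uniform prior on the latent rate theta
   theta ~ dbeta(1, 1)
   # k correct answers out of n questions
   k ~ dbin(theta, n)
}
```

Given this model and the data, WinBUGS draws MCMC samples from the posterior of theta automatically.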

53
**Want to Know More About WinBUGS?**

54
**WinBUGS & R**

WinBUGS produces MCMC samples. We want to analyze the output in a nice program, such as R. This can be accomplished using the R package “R2WinBUGS”.

55
**R: “Here’s the data and a bunch of commands”**

WinBUGS: “OK, I did what you wanted, here’s the samples you asked for”

56
**Getting Started**

Work through some of the exercises in the book. Most of you will want to start with the chapter “Getting Started”.

57
**Running the R Programs**

The R scripts have the extension .R. You can use “File” → “Open Script” to read them, and run them by copying and pasting the scripts into the R console.

58
**Course Webpage**

Check the course webpage for the lectures and a PDF file with answers to the exercises!

59
**Questions?**

Feel free to ask questions when you are really stuck.
