1
Bayesian Statistics Simon French simon.french@warwick.ac.uk

2
The usual view of statistics What does the data – and only the data – tell us in relation to the research questions of interest? By focusing on the data alone, we are ‘clearly’ being objective…

3
But … … classical/frequentist statistical methods contain hidden subjective choices… Why choose 1% or 5% as significance levels? Why choose a minimum variance unbiased estimate rather than a maximum likelihood estimator, which might be biased but lead to tighter bounds? …

4
The Bayesian paradigm … … is explicitly subjective. It
– models judgements and explores their implications: probabilities to represent beliefs and uncertainties (and utilities to represent values and costs, so that inferences lead transparently to decisions)
– is based upon a model of an idealised (consistent, rational) scientist
– focuses first on the individual scientist; then, by varying the scientist’s beliefs, enables the exploration of potential consensus.
For a Bayesian, knowledge is based on consensus

5
The Bayesian view of statistics What are we uncertain about and how does the data reduce that uncertainty? not What does the data – and only the data – tell us in relation to the research questions of interest?

6
Rev. Thomas Bayes 1701?–1761 Main work published posthumously: T. Bayes (1763) An essay towards solving a problem in the doctrine of chances. Phil. Trans. Roy. Soc. 53, 370–418. Bayes Theorem – inverse probability

7
Bayes theorem Posterior probability ∝ likelihood × prior probability: p(θ | x) ∝ p(x | θ) × p(θ)

8
Bayes theorem p(θ | x) ∝ p(x | θ) × p(θ) Our knowledge before the experiment: the probability distribution of the parameters, p(θ)

9
Bayes theorem p(θ | x) ∝ p(x | θ) × p(θ) Our knowledge of the design of the experiment or survey and the actual data: the likelihood of the data given the parameters, p(x | θ)

10
Bayes theorem p(θ | x) ∝ p(x | θ) × p(θ) Our knowledge after the experiment: the probability distribution of the parameters given the data, p(θ | x)

11
Bayes theorem p(θ | x) ∝ p(x | θ) × p(θ) There is a normalising constant, but it is ‘easy’ to find as probability adds (integrates) to one
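The normalising constant can be found numerically: evaluate prior × likelihood on a grid and divide by the sum so the result adds to one. A minimal sketch; the uniform prior and the data (7 successes in 10 trials) are illustrative choices, not from the slides:

```python
import numpy as np

# Grid of candidate parameter values theta in (0, 1)
theta = np.linspace(0.0005, 0.9995, 1000)

prior = np.ones_like(theta)             # uniform prior (illustrative)
likelihood = theta**7 * (1 - theta)**3  # binomial kernel: 7 of 10 trials

# Unnormalised posterior, then divide by its sum - the 'constant'
unnorm = prior * likelihood
posterior = unnorm / unnorm.sum()

print(posterior.sum())  # 1.0 by construction
```

On this grid the posterior mean comes out close to the exact conjugate answer (Beta(8, 4) mean, 2/3), which is a useful sanity check for the approximation.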

12
Medical Test Probability of having disease = 0.001 – i.e. 1 in 1000 – Probability of not having disease = 0.999. The test has a 95% chance of detecting the disease if present, but a 2% chance of falsely detecting it if absent – False negative rate = 5%, False positive rate = 2%. [belief net: Disease → Test] GeNIe Software: http://genie.sis.pitt.edu
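Working the numbers on this slide through Bayes theorem shows why the example is striking: even after a positive test, the posterior probability of disease is only about 4.5%, because the disease is so rare. A direct calculation:

```python
# Posterior probability of disease given a positive test, by Bayes theorem
p_disease = 0.001           # prior: 1 in 1000
p_pos_given_disease = 0.95  # sensitivity (false negative rate 5%)
p_pos_given_healthy = 0.02  # false positive rate

# Normalising constant: total probability of a positive test
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # 0.0454
```

The false positives among the 999-in-1000 healthy people swamp the true positives among the 1-in-1000 diseased.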

13
Simple Bayes Normal Model: [figure: a single parameter θ with observations X₁, …, Xₙ]
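The simple normal model has a closed-form conjugate update when the observation variance is known. A sketch of that standard result; the prior parameters and data values below are illustrative assumptions, not from the slides:

```python
import numpy as np

def normal_update(mu0, tau0_sq, sigma_sq, x):
    """Posterior of a normal mean theta with known variance sigma_sq,
    given prior theta ~ N(mu0, tau0_sq) and observations x."""
    n = len(x)
    post_prec = 1 / tau0_sq + n / sigma_sq  # precisions add
    post_var = 1 / post_prec
    post_mean = post_var * (mu0 / tau0_sq + np.sum(x) / sigma_sq)
    return post_mean, post_var

# Illustrative numbers: vague prior, four observations near 5
x = np.array([4.8, 5.2, 5.1, 4.9])
mean, var = normal_update(mu0=0.0, tau0_sq=100.0, sigma_sq=1.0, x=x)
print(mean, var)
```

With a vague prior the posterior mean sits essentially at the sample mean, as expected; a tighter prior would pull it towards mu0.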

14
Bayes Theorem as applied to Statistics Toss a biased coin 12 times; obtain 9 heads [figure: prior and posterior densities for the probability of heads]
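For this coin example the prior-to-posterior step is a one-line conjugate update: with a Beta(a, b) prior on the probability of heads, 9 heads and 3 tails give a Beta(a+9, b+3) posterior. A sketch assuming a uniform Beta(1, 1) prior (the slides do not specify one):

```python
from scipy.stats import beta

a, b = 1, 1          # uniform Beta(1, 1) prior - an assumption
heads, tails = 9, 3  # 12 tosses, 9 heads

post = beta(a + heads, b + tails)  # conjugacy: posterior is Beta(10, 4)
print(post.mean())                 # posterior mean = 10/14
```

Note the posterior mean (10/14 ≈ 0.714) is pulled slightly towards 0.5 relative to the raw proportion 9/12 = 0.75, by the prior.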

15
Bayesian Estimation Toss a biased coin 12 times; obtain 9 heads [figure: prior and posterior densities] Take the mean, median or mode of the posterior

16
Bayesian confidence interval Toss a biased coin 12 times; obtain 9 heads [figure: prior and posterior densities] Highest posterior density region containing 95% of the probability

17
Bayesian hypothesis test Toss a biased coin 12 times; obtain 9 heads [figure: prior and posterior densities] To test H₀: θ > 0.6, look at Prob(θ > 0.6)
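All three summaries on the last few slides (point estimate, interval, test) read straight off the Beta(10, 4) posterior, assuming the uniform prior as above. One caveat: for simplicity the interval below is equal-tailed rather than the highest-density region the slide describes; for this unimodal posterior the two are close but not identical.

```python
from scipy.stats import beta

post = beta(10, 4)  # posterior after 9 heads, 3 tails with a uniform prior

# Point estimates: mean, median or mode
mean = post.mean()              # 10/14
median = post.median()
mode = (10 - 1) / (10 + 4 - 2)  # Beta mode (a-1)/(a+b-2) = 0.75

# 95% credible interval - equal-tailed here, not strictly highest-density
lo, hi = post.ppf(0.025), post.ppf(0.975)

# 'Test' of H0: theta > 0.6 - posterior probability that theta > 0.6
p_h0 = post.sf(0.6)  # survival function = 1 - cdf(0.6)

print(mean, (lo, hi), p_h0)
```

Reporting Prob(θ > 0.6) directly, rather than a reject/accept decision, anticipates the next slide's point: the posterior itself is the full answer.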

18
But why do any of these? Just report the posterior. It encodes all that is known about θ

19
Bayesian decision analysis [flow diagram: Science – model uncertainties with probabilities; Values – model preferences with multi-attribute utilities; Data – observe X = x from p_X(· | θ); combine via Bayes Theorem to give Advice and a Decision, with feedback to future decisions. Statistics meets Decision and Risk Analysis]

20
Bayes Calculations
Analytic approaches – conjugate families of distributions – Kalman filters
Numerical integration – quadrature – asymptotic expansions
Markov Chain Monte Carlo (MCMC) – Gibbs sampling, particle filters – almost any distributions and models
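As a flavour of the MCMC family, here is a toy random-walk Metropolis sampler for the coin example above (posterior ∝ θ⁹(1−θ)³ under a uniform prior). Purely illustrative: real work would use purpose-built software such as BUGS, and this simple sampler and its tuning constants are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    """Unnormalised log posterior: uniform prior x binomial likelihood."""
    if not 0 < theta < 1:
        return -np.inf
    return 9 * np.log(theta) + 3 * np.log(1 - theta)

samples = []
theta = 0.5  # arbitrary starting point
for _ in range(20000):
    prop = theta + rng.normal(0, 0.1)  # random-walk proposal
    # Accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

# Discard burn-in; the mean should be close to the exact answer 10/14
print(np.mean(samples[5000:]))
```

The point of MCMC is that only the *unnormalised* posterior is needed; the awkward normalising constant cancels in the acceptance ratio.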

21
Modelling uncertainty Might be better to say Bayesians practise uncertainty modelling. There are simple modelling strategies and tools for this – hierarchical modelling – belief nets – …

22
Bayes theorem In real problems, x and θ are multi-dimensional – with ‘big data’, very high dimensional. Can we restructure p(x, θ) to be easier to work with? – e.g. to draw in and use independence structures, etc. p(θ | x) ∝ p(x | θ) × p(θ) = p(x, θ)

23
Hierarchical Models Simple Bayes Normal Model: [figure: a single θ generating X₁, …, Xₙ] Three Stage Bayes Normal Model: [figure: a hyperparameter generating θ₁, θ₂, …, θₙ, with each θᵢ generating Xᵢ]
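The three-stage structure is easiest to see as a generative simulation: hyperparameters generate group-level means, which in turn generate the observations. A sketch with illustrative numbers (all values here are my own, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stage 3: hyperparameters for the population of group means
mu, tau = 0.0, 2.0

# Stage 2: group-level means theta_1 .. theta_n drawn around mu
n_groups = 5
theta = rng.normal(mu, tau, size=n_groups)

# Stage 1: one observation X_i per group, centred on its theta_i
sigma = 1.0
x = rng.normal(theta, sigma)

print(theta.round(2))
print(x.round(2))
```

Inference then runs the other way: observing the xᵢ shrinks each estimated θᵢ towards the shared population mean, which is the practical payoff of the hierarchy.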

24
The Asia Belief Net [network nodes: Visit to Asia? Smoking? Tuberculosis, Lung Cancer, Bronchitis, X-Ray Result? Dyspnea?]
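Inference in a belief net amounts to multiplying conditional probability tables along the graph and summing out hidden nodes. A minimal sketch for a two-edge fragment of this structure (Smoking → Bronchitis → Dyspnea); all the probability numbers below are made up for illustration and are not the actual Asia network's tables:

```python
# Marginal P(Dyspnea) for a tiny chain Smoking -> Bronchitis -> Dyspnea,
# computed by enumeration over the hidden variables.
p_smoking = 0.5                        # P(Smoking) - hypothetical
p_bronch = {True: 0.60, False: 0.30}   # P(Bronchitis | Smoking) - hypothetical
p_dyspnea = {True: 0.80, False: 0.10}  # P(Dyspnea | Bronchitis) - hypothetical

p_d = 0.0
for s in (True, False):
    p_s = p_smoking if s else 1 - p_smoking
    for b in (True, False):
        p_b = p_bronch[s] if b else 1 - p_bronch[s]
        p_d += p_s * p_b * p_dyspnea[b]

print(round(p_d, 3))  # 0.415 with these made-up numbers
```

Tools such as GeNIe automate exactly this kind of summation, with far better algorithms for larger graphs.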

25
Subjectivity vs Objectivity Bayesian statistics is explicitly subjective. Science is (thought to be) objective. Controversy!

26
Importance of prior Different priors lead to different conclusions – subjective, not scientific? Can use:
– an ignorant (vague, non-informative) prior to ‘let the data speak for themselves’
– a precise prior to capture agreed common knowledge
– sensitivity analysis to explore the importance of the priors.
Indeed, sensitivity analysis can explore agreements and disagreements on many aspects of the model, not just the prior. If Science is about a consensus on knowledge, then exploring a range of priors helps establish precisely that
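A sensitivity analysis over priors can be as simple as repeating the coin-example update under several priors and comparing the conclusions. The particular priors below are illustrative choices:

```python
from scipy.stats import beta

# Posterior mean for 9 heads in 12 tosses under several priors -
# a minimal prior sensitivity analysis
heads, tails = 9, 3
priors = {
    "uniform Beta(1, 1)": (1, 1),
    "Jeffreys Beta(0.5, 0.5)": (0.5, 0.5),
    "sceptical Beta(10, 10)": (10, 10),  # strong prior belief near 0.5
}

for name, (a, b) in priors.items():
    post_mean = beta(a + heads, b + tails).mean()
    print(f"{name}: posterior mean = {post_mean:.3f}")
```

The vague priors broadly agree (≈0.71–0.73) while the sceptical prior pulls the mean towards 0.5, which makes the consensus, and its limits, explicit.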

27
All analysis assumes a model … Another subjective choice, and one not often addressed in any discussion of methodology – the same is true in classical/frequentist statistics. Bayesian analysis provides an assessment of uncertainties in the context of the assumed model – the same is true in classical/frequentist statistics: e.g. p-values. Real-world uncertainty includes these, but more arises from the fact that the model is not the real world

28
BUGS Software Bayesian inference Using Gibbs Sampling
– Lunn, D.J., Thomas, A., Best, N. and Spiegelhalter, D. (2000) WinBUGS – a Bayesian modelling framework: concepts, structure, and extensibility. Statistics and Computing 10, 325–337
– Lunn, D.J., Jackson, C., Best, N., Thomas, A. and Spiegelhalter, D. (2013) The BUGS Book: a Practical Introduction to Bayesian Analysis. London, Chapman and Hall
– http://www.mrc-bsu.cam.ac.uk/bugs/

29
Reading
W.M. Bolstad (2007). Introduction to Bayesian Statistics. 2nd Edn, Hoboken, NJ, John Wiley and Sons.
P.M. Lee (2012). Bayesian Statistics: An Introduction. 4th Edn, Chichester, John Wiley and Sons.
R. Christensen, W. Johnson, A. Branscum and T.E. Hanson (2011). Bayesian Ideas and Data Analysis. Boca Raton, CRC/Chapman and Hall.
P. Congdon (2001). Bayesian Statistical Modelling. Chichester, John Wiley and Sons.
S. French and D. Rios Insua (2000). Statistical Decision Theory. London, Arnold.
A. O'Hagan and J. Forster (2004). Bayesian Statistics. London, Edward Arnold.
J.M. Bernardo and A.F.M. Smith (1994). Bayesian Theory. Chichester, John Wiley and Sons.

30
ISBA International Society for Bayesian Analysis www.bayesian.org Many resources and guides to software, literature, etc. Newsletter. Open journal: Bayesian Analysis

31
Thank you
