- 1 - Bayesian inference of binomial problem

Estimating a probability from binomial data
– Objective is to estimate an unknown proportion θ (or probability of success, i.e., of getting 1) from Bernoulli-trial data y1, y2, …, yn, each of which is either 0 or 1.
– Let the parameter θ be the proportion of successes in the population, or equivalently the probability of success.
– Then the probability of obtaining y successes in n trials is
    p(y | θ) = C(n, y) θ^y (1 − θ)^(n − y),
  which is called the binomial distribution. This is a discrete function with respect to y.

Practice
    % binomial pmf computed directly; element-wise operators and a loop over y
    % are needed because nchoosek and ^ do not accept the vector y directly
    y = 0:10; n = 10; p = 0.5;
    pmf = arrayfun(@(k) nchoosek(n, k), y) .* p.^y .* (1-p).^(n-y);
    % same result with the built-in binomial pmf
    pmf = binopdf(y, n, p);
    plot(y, pmf, '+')

- 2 - Bayesian inference of binomial problem

Inference problem statement
– Let the parameter θ be the proportion of females in the population.
– The currently accepted value in Europe is 0.485, which is less than 0.5.
– Estimate θ conditional on the observed data: y females out of n births.
– The simplest estimate is just θ = y/n.

Bayesian inference
– Assume a non-informative prior: θ ~ uniform on [0, 1].
– Likelihood: p(y | θ) = C(n, y) θ^y (1 − θ)^(n − y)
– Posterior density: p(θ | y) ∝ θ^y (1 − θ)^(n − y), which is the Beta(y + 1, n − y + 1) distribution.
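As a quick check of the above, a minimal MATLAB sketch (the data values y and n are hypothetical placeholders, not from the slides): under the uniform prior the posterior is just the normalized likelihood, and it coincides with the Beta(y+1, n−y+1) density.

    % posterior of theta under a uniform prior; y and n are placeholder data
    y = 6; n = 10;
    theta = linspace(0, 1, 201);
    like = binopdf(y, n, theta);                 % likelihood viewed as a function of theta
    post = like / trapz(theta, like);            % normalize numerically over the grid
    plot(theta, post, theta, betapdf(theta, y+1, n-y+1), '--')
    xlabel('\theta'), ylabel('p(\theta | y)')
    legend('normalized likelihood', 'Beta(y+1, n-y+1)')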

- 3 - Bayesian inference of binomial problem

Illustrative results
– Several different experiments with the same proportion of successes but varying sample sizes.
– Interpret the meaning of the figures.
– Practice: plot the four cases using MATLAB functions (a sketch follows below).
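One possible way to do this practice is sketched below; the observed proportion (0.6) and the four sample sizes are assumptions, since the slide does not list the actual cases. The point to observe is that the posterior concentrates around the same proportion as the sample size grows.

    % four experiments with the same observed proportion but different sample sizes (assumed values)
    theta = linspace(0, 1, 501);
    ns = [5 20 100 1000];
    for k = 1:numel(ns)
        n = ns(k); y = round(0.6*n);             % same success proportion, 0.6, in each case
        subplot(2, 2, k)
        plot(theta, betapdf(theta, y+1, n-y+1))  % posterior under the uniform prior
        title(sprintf('y = %d, n = %d', y, n))
        xlabel('\theta')
    end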

- 4 - Bayesian inference of binomial problem

Beta pdf
– In fact, the posterior density is a beta distribution with parameters α = y + 1, β = n − y + 1:
    p(θ | y) = Beta(θ | y + 1, n − y + 1)
– Practice: plot the four cases using the beta pdf function.

Laplace in the 18th century
– 241,945 girls and 251,527 boys were born in Paris from 1745 to 1770.
– P[θ ≥ 0.5 | y] ≈ 1.15 × 10^−42, so he was ‘certain’ that θ < 0.5.
– Practice: calculate this value and validate it (sketch below).

Posterior prediction
– What is the probability of getting a girl if a new baby is born?
– Practice: calculate this value. What if the numbers were 2 out of 5?
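A sketch of the Laplace practice using the counts quoted on the slide; everything follows from the uniform-prior posterior Beta(y+1, n−y+1).

    % Laplace's calculation: posterior is Beta(y+1, n-y+1) under the uniform prior
    y = 241945; n = 241945 + 251527;             % girls and total births quoted on the slide
    a = y + 1; b = n - y + 1;
    P_theta_ge_half = betacdf(0.5, a, b, 'upper')   % upper-tail P[theta >= 0.5 | y], about 1.15e-42
    % posterior predictive probability that the next birth is a girl (rule of succession)
    p_next_girl = (y + 1) / (n + 2)
    % the "2 out of 5" variant of the same predictive probability
    p_small_sample = (2 + 1) / (5 + 2)

The first number reproduces the value quoted above; the predictive probability (y+1)/(n+2) is the mean of the Beta posterior.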

- 5 - Bayesian inference of binomial problem

Summarizing posterior inference
– Location summaries:
    Mean: the posterior expectation; needs integration.
    Mode: the most likely value, i.e., the maximum of the pdf; needs optimization or solving d(pdf)/dθ = 0.
    Median: the 50th-percentile value.
  Among these, the mode is preferred for its computational convenience.
– Variation summaries:
    Standard deviation or variance.
    Interquartile range or 100(1 − α)% interval.
– Practice with the beta pdf (sketch below): the mean and mode are given analytically by closed-form equations; the others are obtained using MATLAB functions.
– In general, these values are computed using computer simulations from the posterior distribution.
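For the practice item, a sketch using the Statistics Toolbox beta functions; the posterior parameters a and b below are placeholders and should be replaced with the ones implied by your data.

    % analytical / built-in summaries of a Beta(a,b) posterior; a and b are placeholders
    a = 4; b = 8;
    [m, v] = betastat(a, b);                     % mean and variance
    post_mean = m
    post_mode = (a - 1) / (a + b - 2)            % mode of the beta pdf (valid for a, b > 1)
    post_median = betainv(0.5, a, b)             % 50th percentile
    post_sd = sqrt(v)
    interval95 = betainv([0.025 0.975], a, b)    % central 95% posterior interval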

- 6 - Bayesian inference of binomial problem

Informative prior
– So far we have considered only the uniform prior.
– Recall that the likelihood is the binomial pdf:
    p(y | θ) = C(n, y) θ^y (1 − θ)^(n − y)
– Let us introduce a beta prior,
    p(θ) ∝ θ^(α − 1) (1 − θ)^(β − 1),
  where α and β are called hyperparameters.
– Then the posterior density is
    p(θ | y) ∝ θ^(y + α − 1) (1 − θ)^(n − y + β − 1) = Beta(θ | α + y, β + n − y).

Remarks
– The property that the posterior distribution follows the same form as the prior is called conjugacy. The beta prior is the conjugate family for the binomial likelihood.
– As a result, the mean and variance of the posterior pdf are
    E[θ | y] = (α + y) / (α + β + n),
    Var[θ | y] = E[θ | y] (1 − E[θ | y]) / (α + β + n + 1).
– As y and n − y become large with α and β fixed, E[θ | y] approaches y/n and the variance goes to zero. In the limit, the parameters of the prior have no influence on the posterior. Moreover, the posterior converges to a normal pdf due to the central limit theorem.
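A small sketch of how an informative prior changes the posterior; the hyperparameters α = β = 4 and the data values are assumptions chosen only for illustration.

    % informative Beta(alpha, beta) prior vs the uniform prior (= Beta(1,1)); values assumed
    alpha = 4; beta_ = 4;                        % beta_ avoids shadowing MATLAB's beta function
    y = 6; n = 10;                               % placeholder data
    theta = linspace(0, 1, 501);
    plot(theta, betapdf(theta, alpha + y, beta_ + n - y), ...
         theta, betapdf(theta, 1 + y, 1 + n - y), '--')
    legend('Beta(4,4) prior', 'uniform prior')
    xlabel('\theta'), ylabel('p(\theta | y)')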

- 7 - Homework

2.5 example: estimating the probability of female birth
– P[θ < …] (threshold as given in the example)
– Histogram of the posterior pdf of θ | y
– Median and 95% posterior interval. Ans: 0.446, [0.415, 0.477]
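A simulation-based sketch of the required quantities; the data values y, n and the threshold 0.485 below are hypothetical placeholders and must be replaced with those given in the textbook example.

    % posterior summaries by simulation; replace y, n and the threshold with the example's values
    y = 6; n = 10;                               % hypothetical placeholders
    a = y + 1; b = n - y + 1;                    % posterior under the uniform prior
    draws = betarnd(a, b, 10000, 1);             % samples from the posterior
    histogram(draws)                             % histogram of the posterior pdf of theta | y
    post_median = median(draws)
    interval95 = prctile(draws, [2.5 97.5])
    prob_less = mean(draws < 0.485)              % P[theta < threshold | y]; threshold assumed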