Ch15: Decision Theory & Bayesian Inference

15.1: INTRO: We are back to some theoretical statistics:
1. Decision Theory – make decisions in the presence of uncertainty
2. Bayesian Inference – an alternative to the traditional ("frequentist") method

15.2: Decision Theory. New terminology:
- (true) state of nature = the parameter θ
- action a: a choice based on the observation of data, or of a random variable X whose CDF depends on θ
- (statistical) decision function d: maps the data X to an action d(X)
- loss function l(θ, a): the cost of taking action a when the true state is θ
- risk function R(θ, d) = E[l(θ, d(X))] = expected loss
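The risk of a decision rule can be approximated by simulation. Below is a minimal sketch (the rule, loss, and parameter values are my own illustration, not from the slides): the rule d(X) = X-bar for a Bernoulli parameter θ, under squared-error loss, whose risk equals the variance θ(1−θ)/n.

```python
import random

random.seed(0)

def risk_sample_mean(theta, n, reps=20000):
    """Monte Carlo estimate of R(theta, d) for the rule d(X) = X-bar
    under squared-error loss l(theta, a) = (theta - a)**2."""
    total = 0.0
    for _ in range(reps):
        # one draw of X-bar from n Bernoulli(theta) flips
        xbar = sum(1 for _ in range(n) if random.random() < theta) / n
        total += (theta - xbar) ** 2  # squared-error loss for this draw
    return total / reps

# For squared error, the risk of X-bar is its variance theta(1-theta)/n,
# so the estimate should be close to 0.3 * 0.7 / 50 = 0.0042.
est = risk_sample_mean(theta=0.3, n=50)
```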

Example: Game Theory. A = manager of an oil company vs. B = opponent (Nature). Situation: is there any oil at a given location? Each of the players A and B has a choice of 2 moves:
- A chooses between the actions: continue drilling or stop drilling.
- B controls the parameter: whether there is oil or not.

15.2.1: Bayes & Minimax Rules. A "good decision" is one with smaller risk. But what if neither of two candidate rules has uniformly smaller risk for every θ? To get around this, use either a minimax or a Bayes rule:
- Minimax rule: choose the rule d minimizing the maximum of R(θ, d) over θ (minimize the maximum risk).
- Bayes rule: choose the rule d minimizing the Bayes risk, the average of R(θ, d) over a prior distribution on θ.

Classical Stat. vs Bayesian Stat.
- Classical (or frequentist): parameters are unknown but fixed quantities, to be estimated from the data.
- Bayesian: parameters are random variables; the data and a prior distribution are combined to produce the posterior distribution.
The picture is the same as before, with prior information added: Prior Information + Model + Data → Inference and/or Prediction.

15.2.2: Posterior Analysis. Bayesians treat the parameter θ as a random variable with a prior distribution f(θ); after observing the data x, it has a posterior distribution f(θ | x) ∝ f(x | θ) f(θ).
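The proportionality f(θ | x) ∝ f(x | θ) f(θ) can be carried out directly on a discrete grid of θ values. The grid, the uniform prior, and the coin-flip data below are my own illustration:

```python
# Posterior update f(theta | x) ∝ f(x | theta) f(theta) on a discrete grid.
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
prior = [0.2] * 5            # uniform prior over the grid
data = [1, 1, 0, 1]          # observed coin flips (1 = head)

def likelihood(theta, data):
    """Bernoulli likelihood f(x | theta) for the whole sample."""
    p = 1.0
    for x in data:
        p *= theta if x == 1 else 1 - theta
    return p

# multiply likelihood by prior, then normalize so the posterior sums to 1
unnorm = [likelihood(t, p_t) if False else likelihood(t, data) * p for t, p in zip(thetas, prior)]
posterior = [u / sum(unnorm) for u in unnorm]
```

With 3 heads in 4 flips, the posterior mass shifts toward the grid point θ = 0.7.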

15.2.3: Classification & Hypothesis Testing. Wish: classify an element as belonging to one of the classes partitioning a population of interest; e.g., an utterance is classified by a computer as one of the words in its dictionary via sound measurements. Hypothesis testing can be seen as a classification problem with a constraint on the probability of misclassification (the probability of a type I error).
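A minimal sketch of the classification idea: assign an observation x to the class c maximizing the posterior P(c | x) ∝ p(x | c) P(c). The two "word" classes, their normal class-conditional densities, and all parameter values are hypothetical placeholders, not from the slides:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical classes: each has a prior P(c) and a class-conditional N(mu, sigma^2).
classes = {
    "word_A": {"prior": 0.5, "mu": 0.0, "sigma": 1.0},
    "word_B": {"prior": 0.5, "mu": 3.0, "sigma": 1.0},
}

def classify(x):
    """Return the class maximizing p(x | c) * P(c), i.e. the posterior mode."""
    return max(classes,
               key=lambda c: classes[c]["prior"]
                             * normal_pdf(x, classes[c]["mu"], classes[c]["sigma"]))
```

A measurement near 0 is assigned to word_A and one near 3 to word_B; the decision boundary sits at x = 1.5 because the priors and variances are equal.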

15.2.4: Estimation

15.2.4: Estimation (example)
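The details of the estimation example did not survive in this transcript. A standard result in this setting, sketched below, is that under squared-error loss the Bayes estimate is the posterior mean E(θ | x). The grid posterior used here is an illustrative placeholder:

```python
# Under squared-error loss l(theta, a) = (theta - a)^2, the Bayes estimate
# is the posterior mean E(theta | x). Illustrative grid posterior:
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
posterior = [0.05, 0.15, 0.30, 0.35, 0.15]

post_mean = sum(t * p for t, p in zip(thetas, posterior))  # E(theta | x)

def expected_posterior_loss(a):
    """Posterior expected squared-error loss of reporting the point estimate a."""
    return sum(p * (t - a) ** 2 for t, p in zip(thetas, posterior))

# expected_posterior_loss is minimized exactly at a = post_mean
```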

15.3.1: Bayesian Inference for the Normal Distribution

How is the prior distribution altered by a random sample?
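For a normal prior and normal data with known variance, the answer has a closed form: the posterior is again normal, precisions (inverse variances) add, and the posterior mean is a precision-weighted average of the prior mean and the sample mean. A sketch of that update (the function name and the numbers in the usage line are my own):

```python
# N(mu0, tau0_sq) prior for theta; n observations from N(theta, sigma_sq),
# sigma_sq known, with sample mean xbar. The posterior for theta is normal.

def normal_update(mu0, tau0_sq, sigma_sq, xbar, n):
    """Return (posterior mean, posterior variance)."""
    prec = 1 / tau0_sq + n / sigma_sq                 # precisions add
    mean = (mu0 / tau0_sq + n * xbar / sigma_sq) / prec  # precision-weighted average
    return mean, 1 / prec

# A very flat prior (huge tau0_sq) leaves the data in charge: the posterior
# mean is essentially xbar. An informative N(0, 1) prior pulls it toward 0.
flat_mean, _ = normal_update(0.0, 1e6, 1.0, 2.5, 100)
informative_mean, informative_var = normal_update(0.0, 1.0, 1.0, 2.5, 100)
```

With the informative prior, the posterior mean is 250/101 ≈ 2.475 and the posterior variance 1/101: a large sample quickly overwhelms the prior.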

15.3.2: The Beta Dist’n is a conjugate prior to the Binomial
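Conjugacy means the posterior stays in the prior's family: a Beta(a, b) prior combined with k successes in n Binomial trials gives a Beta(a + k, b + n − k) posterior. The one-line update below, with an illustrative uniform prior, shows this:

```python
# Beta(a, b) prior + k successes in n Binomial(n, theta) trials
# -> Beta(a + k, b + n - k) posterior ("conjugate": same family as the prior).

def beta_binomial_update(a, b, k, n):
    """Return the parameters of the Beta posterior."""
    return a + k, b + (n - k)

# e.g. a uniform Beta(1, 1) prior with 7 heads in 10 flips gives Beta(8, 4),
# whose mean is 8 / (8 + 4) = 2/3.
a1, b1 = beta_binomial_update(1, 1, 7, 10)
post_mean = a1 / (a1 + b1)
```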