Presentation transcript: "Ch15: Decision Theory & Bayesian Inference"

1 Ch15: Decision Theory & Bayesian Inference
15.1: INTRO: We are back to some theoretical statistics:
1. Decision Theory: make decisions in the presence of uncertainty.
2. Bayesian Inference: an alternative to the traditional ("frequentist") method.

2 15.2: Decision Theory
New terminology:
- (true) state of nature = the parameter θ
- action a: a choice based on the observation of data, or of a random variable X whose CDF depends on θ
- (statistical) decision function: d(X) = a
- loss function: l(θ, a)
- risk function = expected loss: R(θ, d) = E[ l(θ, d(X)) ]
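As a hedged illustration of these definitions (not from the slides), the sketch below takes squared-error loss and the sample mean as the decision rule, and checks by Monte Carlo that its risk equals σ²/n; the specific numbers are made up.

```python
# A minimal sketch: estimating the mean of a Normal with squared-error loss
# l(theta, a) = (theta - a)^2.  The risk of the rule d(X) = sample mean is
# E[(theta - Xbar)^2] = sigma^2 / n, checked here by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n = 2.0, 1.0, 25            # illustrative state of nature and sample size
reps = 100_000

samples = rng.normal(theta, sigma, size=(reps, n))
actions = samples.mean(axis=1)            # decision rule d(X) = sample mean
losses = (theta - actions) ** 2           # squared-error loss
print("Monte Carlo risk:   ", losses.mean())
print("Theoretical sigma^2/n:", sigma**2 / n)   # 0.04
```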

3 Example: Game Theory
A = the manager of an oil company vs. B = the opponent (Nature).
Situation: is there any oil at a given location?
Each of the players A and B has a choice of two moves:
- A chooses between the actions: continue drilling or stop drilling.
- B controls the choice of parameter: whether there is oil or not.
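The slides give no numerical losses, so the sketch below writes this game as a 2×2 loss matrix with made-up entries, one row per state of nature and one column per action.

```python
# A 2x2 loss matrix for the oil-drilling game; the entries are illustrative
# only (the slides give no numbers).  Rows: Nature's states; columns: actions.
import numpy as np

states = ["oil", "no oil"]
actions = ["drill", "stop"]
loss = np.array([
    [0.0, 10.0],   # oil present: drilling is right, stopping forfeits the field
    [5.0,  0.0],   # no oil: drilling wastes money, stopping is right
])

for i, s in enumerate(states):
    for j, a in enumerate(actions):
        print(f"l({s:6s}, {a:5s}) = {loss[i, j]}")
```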

4 15.2.1: Bayes & Minimax Rules
A "good" decision rule d is one with small risk R(θ, d). But what if no rule has the smallest risk for every value of θ?
To get around this, use either a Minimax or a Bayes rule:
- Minimax rule: minimize the maximum risk, i.e. choose d to minimize max over θ of R(θ, d).
- Bayes rule: minimize the Bayes risk, i.e. choose d to minimize the average risk E[R(Θ, d)] under a prior distribution on θ.
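As a hedged, no-data illustration (so the risk of each action is just its loss), the sketch below picks the minimax action and the Bayes action for the hypothetical oil loss matrix above, with a made-up prior probability of oil.

```python
# Minimax vs. Bayes action for a no-data decision problem, where the risk of
# an action equals its loss.  Loss matrix and prior are illustrative only.
import numpy as np

actions = ["drill", "stop"]
loss = np.array([
    [0.0, 10.0],   # state: oil
    [5.0,  0.0],   # state: no oil
])
prior = np.array([0.3, 0.7])      # made-up prior: P(oil) = 0.3

max_risk = loss.max(axis=0)       # worst-case loss of each action
bayes_risk = prior @ loss         # expected loss of each action under the prior

print("minimax action:", actions[max_risk.argmin()], max_risk)     # drill
print("Bayes action:  ", actions[bayes_risk.argmin()], bayes_risk) # stop
```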

5 Classical Stat. vs Bayesian Stat.
Classical (or frequentist): parameters are unknown but fixed, and are estimated from the data.
Bayesian: parameters are random variables; the data and a prior distribution are combined to obtain the posterior distribution.
(Diagram: Data → Model → Inference and/or Prediction; the same picture as before, but with prior information feeding into the model in the Bayesian case.)

6 15.2.2: Posterior Analysis
Bayesians treat the parameter θ as a random variable with a prior distribution π(θ); after observing data x with likelihood f(x | θ), it has a posterior distribution π(θ | x) ∝ f(x | θ) π(θ).
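A minimal sketch of this prior-to-posterior update (not from the slides), done numerically on a grid: a Uniform prior on a Binomial success probability, with the posterior proportional to likelihood times prior.

```python
# Grid computation of a posterior: posterior ∝ likelihood × prior, for a
# Binomial likelihood with a Uniform(0, 1) prior on the success probability.
# The data values are made up.
import numpy as np

x, n = 7, 10                              # observed successes out of n trials
theta = np.linspace(0.001, 0.999, 999)    # grid over the parameter
prior = np.ones_like(theta)               # Uniform prior
likelihood = theta**x * (1 - theta)**(n - x)
posterior = likelihood * prior
posterior /= np.trapz(posterior, theta)   # normalize so it integrates to 1

print("posterior mean:", np.trapz(theta * posterior, theta))   # ≈ (x+1)/(n+2)
```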

7 15.2.3: Classification & Hypothesis Testing
Wish: classify an element as belonging to one of the classes partitioning a population of interest. E.g., an utterance is classified by a computer as one of the words in its dictionary via sound measurements.
Hypothesis testing can be seen as a classification problem with a constraint on the probability of misclassification (the probability of a type I error).
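As a hedged illustration (the distributions and level are assumptions, not from the slides), the sketch below treats a test of N(0, 1) against N(1, 1) as a classification rule whose cutoff is chosen so that the type I error probability is α = 0.05.

```python
# A classification rule with a constraint on the probability of type I error:
# classify x as coming from H1 when x exceeds a cutoff set by the H0 quantile.
from scipy.stats import norm

alpha = 0.05
cutoff = norm.ppf(1 - alpha, loc=0, scale=1)   # classify as H1 when x > cutoff

def classify(x):
    return "H1" if x > cutoff else "H0"

print("cutoff:", round(cutoff, 3))              # ≈ 1.645
print("type I error:", 1 - norm.cdf(cutoff))    # = alpha by construction
print("power:", 1 - norm.cdf(cutoff, loc=1))    # P(classify as H1 | H1 true)
```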

8 15.2.4: Estimation

9 15.2.4: Estimation (example)
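A hedged sketch of decision-theoretic estimation, assuming squared-error loss and the Binomial/uniform-prior setup used above: under squared-error loss the Bayes estimate is the posterior mean, which the code below checks numerically.

```python
# Under squared-error loss, the action minimizing the posterior expected loss
# is the posterior mean.  Checked numerically for a grid posterior (made-up data).
import numpy as np

x, n = 7, 10
theta = np.linspace(0.001, 0.999, 999)
posterior = theta**x * (1 - theta)**(n - x)          # Uniform prior
posterior /= np.trapz(posterior, theta)

# posterior expected loss E[(theta - a)^2 | x] over a grid of candidate actions a
cand = np.linspace(0, 1, 501)
exp_loss = [np.trapz((theta - a)**2 * posterior, theta) for a in cand]

print("minimizing action:", cand[np.argmin(exp_loss)])             # ≈ posterior mean
print("posterior mean:   ", np.trapz(theta * posterior, theta))    # (x+1)/(n+2)
```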

10 15.3.1: Bayesian Inference for the Normal Distribution

11 How is the prior distribution altered by a random sample?
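A minimal sketch of how a Normal prior is altered by a random sample, assuming the standard conjugate setup of Section 15.3.1 (Normal likelihood with known variance); the specific numbers are illustrative.

```python
# Normal prior, Normal likelihood with known variance: the posterior is again
# Normal, with precision = prior precision + n / sigma^2.
import numpy as np

rng = np.random.default_rng(1)
mu0, tau0 = 0.0, 2.0            # prior mean and standard deviation (illustrative)
sigma, n = 1.0, 20              # known data s.d. and sample size
x = rng.normal(1.5, sigma, n)   # simulated sample (true mean 1.5)

post_prec = 1 / tau0**2 + n / sigma**2
post_var = 1 / post_prec
post_mean = post_var * (mu0 / tau0**2 + n * x.mean() / sigma**2)

print("posterior mean:", round(post_mean, 3))       # pulled from mu0 toward x.mean()
print("posterior s.d.:", round(post_var**0.5, 3))
```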

12 15.3.2: The Beta Dist’n is a conjugate prior to the Binomial
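A minimal sketch of the conjugacy stated on this slide: a Beta(a, b) prior combined with x successes in n Binomial trials gives a Beta(a + x, b + n − x) posterior (the hyperparameters and data below are made up).

```python
# Beta prior + Binomial data -> Beta posterior, in closed form.
from scipy.stats import beta

a, b = 2, 2          # prior hyperparameters (illustrative values)
x, n = 7, 10         # observed successes out of n trials

post = beta(a + x, b + n - x)
print("posterior mean:", post.mean())                # (a + x) / (a + b + n)
print("95% credible interval:", post.interval(0.95))
```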

