Unit 7: Chance Variation

Repeating with Excel

Two EV’s and two SE’s
Suppose we know the box (model) of a game, and we make many draws from the box with replacement (i.e., play the game many times). What should we expect to get for the sum of the draws (the Expected Value of the sum) and for the average of the draws (the EV of the avg)? And how much variability should we expect in the sum (the Standard Error of the sum) and in the average (the SE of the avg)?

Formulas for EV’s
n = number of draws (plays)
EV for sum = n × (AV of box)
EV for avg = AV of box
Naturally, to get from the value (EV or SE) for the sum to the value for the average, divide by n. The formula for the EV of the sum is clear enough; a small sketch follows below.
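
As a quick illustration (not from the original slides), here is a minimal Python sketch of these two formulas. The box, the number of draws, and the variable names are hypothetical choices; the $1/−$1 box matches game (a) on the last slide below.

```python
# Minimal sketch (hypothetical setup): EV of the sum and of the average of n draws.
# Box for the one-coin game: win $1 (heads) or lose $1 (tails), equally likely.
box = [1, -1]
n = 100                          # number of draws (plays)

av_of_box = sum(box) / len(box)  # AV of box
ev_sum = n * av_of_box           # EV for sum = n * (AV of box)
ev_avg = av_of_box               # EV for avg = AV of box

print(ev_sum, ev_avg)            # 0.0 0.0 for this box
```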

Formulas for SE’s
SE for sum = (SD of box) × √n
SE for avg = (SD of box) / √n
The SE for the sum takes some algebra to derive. It gets larger as n gets larger, but only as √n, so the SE for the average actually gets smaller as n gets larger. I don’t know why the authors decided to put the sum and average formulas into different chapters. Also, they always compute the SE for the average as the SE for the sum divided by n -- but it is the same. Honest!
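
A hedged sketch, again in Python with the same hypothetical box: it computes the two SE formulas directly and then checks the SE for the sum against a brute-force simulation of repeated plays. The number of repetitions (10,000) is an arbitrary choice.

```python
import math
import random

# Minimal sketch (same hypothetical box): SE of the sum and of the average,
# with a brute-force check of the SE for the sum.
box = [1, -1]
n = 100

av = sum(box) / len(box)
sd_of_box = math.sqrt(sum((x - av) ** 2 for x in box) / len(box))

se_sum = sd_of_box * math.sqrt(n)   # SE for sum = (SD of box) * sqrt(n)
se_avg = sd_of_box / math.sqrt(n)   # SE for avg = (SD of box) / sqrt(n)
print(se_sum, se_avg)               # 10.0 and 0.1 for this box

# Empirical check: repeat the 100 draws 10,000 times and measure the spread of the sums.
sums = [sum(random.choice(box) for _ in range(n)) for _ in range(10_000)]
mean = sum(sums) / len(sums)
observed_se = math.sqrt(sum((s - mean) ** 2 for s in sums) / len(sums))
print(observed_se)                  # typically close to 10
```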

SE’s of count and % → Law of Averages
If we repeat an experiment more and more times (i.e., as n gets larger):
– the fraction of times an event occurs will get closer to the probability of that event (i.e., the SE for the % gets smaller), but
– the difference (number of times the event occurs) − (probability)(number of trials) is likely to go up (i.e., the SE for the count gets larger).
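
To see the two statements side by side, here is a small illustrative simulation (Python, hypothetical setup): it tosses a fair coin up to 100,000 times and prints how far the observed fraction of heads and the observed count of heads are from their expected values at a few checkpoints. On a typical run the fraction error shrinks while the count error tends to grow in absolute size, roughly like √n.

```python
import random

# Illustrative simulation of the Law of Averages (hypothetical setup):
# toss a fair coin n times; the fraction of heads closes in on 0.5,
# while the count of heads tends to drift farther from n/2 in absolute terms.
random.seed(0)   # arbitrary seed, only so the run is reproducible

heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5
    if n in (100, 1_000, 10_000, 100_000):
        print(f"n = {n:>6}:  fraction - 0.5 = {heads / n - 0.5:+.4f},  "
              f"count - n/2 = {heads - n / 2:+.1f}")
```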

The usefulness of the SE
With which game is one more likely to make ≥ $5 in 100 rounds?
(a) Flip a coin: win $1 if heads, lose $1 if tails.
(b) Flip 2 coins: win $(#H − #T).
(a) EV = 0, SE = (1 − (−1))·√(0.5 × 0.5)·√100 = $10
(b) EV = 0, SE = √[((−2)² + 2(0)² + (2)²)/4]·√100 = $10√2
So in (a), $5 corresponds to z = (5 − 0)/10 = 0.5; in (b), $5 corresponds to z = (5 − 0)/(10√2) ≈ 0.35. The smaller z-value leaves a larger area in the right tail of the normal curve, so winning ≥ $5 is more likely in (b).
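
For completeness, a minimal sketch (Python, not part of the original slides) that redoes this comparison with the normal approximation, computing the SE for each box and the approximate chance that the 100-round sum is at least $5. The helper name tail_prob_at_least and the omission of a continuity correction are my choices.

```python
from math import erf, sqrt

# Minimal sketch (hypothetical helper names): compare the two games using the
# normal approximation, with no continuity correction.

def tail_prob_at_least(value, ev, se):
    """Approximate P(sum >= value) as the area to the right of z under the normal curve."""
    z = (value - ev) / se
    return 1 - 0.5 * (1 + erf(z / sqrt(2)))   # 1 - Phi(z)

n = 100
games = {"(a) one coin": [1, -1], "(b) two coins": [-2, 0, 0, 2]}
for label, box in games.items():
    av = sum(box) / len(box)
    sd = sqrt(sum((x - av) ** 2 for x in box) / len(box))
    ev, se = n * av, sd * sqrt(n)
    print(label, "SE =", round(se, 2), " P(sum >= $5) ~", round(tail_prob_at_least(5, ev, se), 3))
# (a): SE = 10.0,   P ~ 0.309
# (b): SE ~ 14.14,  P ~ 0.362
```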