Probability and Statistics for Biomedical Engineering ( 의료통계학, Medical Statistics ) Random Variables ( 확률변수 ) and Probability Distributions. Probability & Statistics for Biomedical Engineering (2015 Spring). Prof. Jae Young Choi, Ph.D. Biomedical Informatics and Pattern Recognition Lab. Department of Biomedical Engineering.

Review of Probability Random Variables  A random variable, x, is a real-valued function defined on the events of the sample space, S. In words, for each event in S, there is a real number that is the corresponding value of the random variable.  Viewed yet another way, a random variable maps each event in S onto the real line. That is it. A simple, straightforward definition. X Event s R.V X(s) Mapping

Review of Probability Random Variables

Review of Probability Random Variables (Con’t) Example: Consider again the experiment of drawing a single card from a standard deck of 52 cards. Suppose that we define the following events. A: a heart; B: a spade; C: a club; and D: a diamond, so that S = {A, B, C, D}. A random variable is easily defined by letting x = 1 represent event A, x = 2 represent event B, and so on. The sample space {A, B, C, D} maps onto the random-variable space {1, 2, 3, 4}.

Review of Probability Random Variables (Con’t) Example: Consider the experiment of throwing a single die and observing the value of the up-face. We can define a random variable as the numerical outcome of the experiment (i.e., 1 through 6), but there are many other possibilities. For example, a binary random variable could be defined simply by letting x = 0 represent the event that the outcome of the throw is an even number and x = 1 otherwise.

Review of Probability Random Variables (Con’t) Note the important fact in the examples just given that the probabilities of the events have not changed; all a random variable does is map events onto the real line.

Review of Probability Random Variables (Con’t)

Review of Probability Random Variables (Con’t)

Review of Probability Random Variables (Con’t)  Thus far we have been concerned with random variables whose values are discrete. To handle continuous random variables we need some additional tools. In the discrete case, the probabilities of events are numbers between 0 and 1.  When dealing with continuous quantities (which are not denumerable) we can no longer talk about the “probability of an event” because that probability is zero. This is not as unfamiliar as it may seem.  For example, given a continuous function we know that the area under the function between two limits a and b is the integral from a to b of the function. However, the area at a point is zero because the integral from, say, a to a is zero. We are dealing with the same concept in the case of continuous random variables.

Review of Probability Random Variables (Con’t) Thus, instead of talking about the probability of a specific value, we talk about the probability that the value of the random variable lies in a specified range. In particular, we are interested in the probability that the random variable is less than or equal to (or, similarly, greater than or equal to) a specified constant a. We write this as F(a) = P(x ≤ a). If this function is given for all values of a (i.e., −∞ < a < ∞), then the values of random variable x have been defined. Function F is called the cumulative probability distribution function or simply the cumulative distribution function (cdf). The shortened term distribution function also is used.

Review of Probability Random Variables (Con’t) Example of CDF

Review of Probability Random Variables (Con’t) Due to the fact that it is a probability, the cdf has the following properties: (1) F(−∞) = 0; (2) F(∞) = 1; (3) 0 ≤ F(x) ≤ 1; (4) F(x1) ≤ F(x2) for x1 < x2, i.e., F is nondecreasing; (5) P(x1 < x ≤ x2) = F(x2) − F(x1); (6) F(x+) = F(x), where x+ = x + ε, with ε being a positive, infinitesimally small number (F is continuous from the right).
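These standard cdf properties (F runs from 0 up to 1, never decreases, and differences of F give interval probabilities) can be checked numerically for a simple discrete random variable. The fair-die example below is our own illustration, not from the slides:

```python
# CDF of a fair die: F(a) = P(x <= a), built from the pmf P(x) = 1/6 for x = 1..6.
def die_cdf(a):
    return sum(1/6 for face in range(1, 7) if face <= a)

# Property checks: F below the support is 0, F at the top is 1,
# F is nondecreasing, and P(x1 < x <= x2) = F(x2) - F(x1).
assert die_cdf(0) == 0.0                               # below the support
assert abs(die_cdf(6) - 1.0) < 1e-12                   # at the top of the support
assert all(die_cdf(a) <= die_cdf(a + 1) for a in range(0, 6))  # nondecreasing
p_2_to_4 = die_cdf(4) - die_cdf(2)                     # P(2 < x <= 4) = P(3) + P(4)
assert abs(p_2_to_4 - 2/6) < 1e-12
```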

Review of Probability Random Variables (Con’t) The probability density function (pdf) of random variable x is defined as the derivative of the cdf: p(x) = dF(x)/dx. The term density function is commonly used also.

Review of Probability Random Variables (Con’t) The pdf satisfies the following properties: (1) p(x) ≥ 0 for all x; (2) the integral of p(x) over all x equals 1; (3) F(x) equals the integral of p(α) from −∞ to x; (4) P(x1 < x ≤ x2) equals the integral of p(x) from x1 to x2.

Review of Probability Random Variables (Con’t) Example of PDF and CDF

Review of Probability Random Variables (Con’t) Example of PDF and CDF

Review of Probability Random Variables (Con’t) The preceding concepts are applicable to discrete random variables. In this case, there is a finite number of events and we talk about probabilities, rather than probability density functions. Integrals are replaced by summations and, sometimes, the random variables are subscripted. For example, in the case of a discrete variable with N possible values we would denote the probabilities by P(x_i), i = 1, 2, …, N.

Review of Probability Random Variables (Con’t) In Sec. 3.3 of the book we used the notation p(r k ), k = 0,1,…, L - 1, to denote the histogram of an image with L possible gray levels, r k, k = 0,1,…, L - 1, where p(r k ) is the probability of the kth gray level (random event) occurring. The discrete random variables in this case are gray levels. It generally is clear from the context whether one is working with continuous or discrete random variables, and whether the use of subscripting is necessary for clarity. Also, uppercase letters (e.g., P) are frequently used to distinguish between probabilities and probability density functions (e.g., p) when they are used together in the same discussion.

Practice Problem: The number of patients seen in the ER in any given hour is a random variable represented by x, with the probability distribution P(x) given in the table on the slide. Find the probability that in a given hour: a. exactly 14 patients arrive; b. at least 12 patients arrive; c. at most 11 patients arrive. Answers: P(x = 14) = .1; P(x ≥ 12) = .4; P(x ≤ 11) = (.4 + .2) = .6.

Review Question 1 If you toss a die, what’s the probability that you roll a 3 or less? a. 1/6  b. 1/3  c. 1/2  d. 5/6  e. 1.0

Review Question 1 If you toss a die, what’s the probability that you roll a 3 or less? Answer: c. Three of the six equally likely faces (1, 2, 3) are 3 or less, so the probability is 3/6 = 1/2.

Review Question 2 Two dice are rolled and the sum of the face values is six. What is the probability that at least one of the dice came up a 3? a. 1/5  b. 2/3  c. 1/2  d. 5/6  e. 1.0

Review Question 2 Two dice are rolled and the sum of the face values is six. What is the probability that at least one of the dice came up a 3? How can you get a 6 on two dice? 1-5, 5-1, 2-4, 4-2, 3-3. Exactly one of these five equally likely outcomes includes a 3, so the answer is a. 1/5.

Continuous case  The probability function that accompanies a continuous random variable is a continuous mathematical function that integrates to 1.  For example, recall the negative exponential function (in probability, this is called an “ exponential distribution ” ):  This function integrates to 1:

Continuous case: “probability density function” (pdf). The density p(x) = e^(−x) starts at height 1 at x = 0 and decays toward 0. The probability that x is any exact particular value is 0; we can only assign probabilities to possible ranges of x.

For example, the probability of x falling within 1 to 2 is the integral of p(x) = e^(−x) from 1 to 2, which equals e^(−1) − e^(−2) ≈ .23. Clinical example: Survival times after lung transplant may roughly follow an exponential function. Then, the probability that a patient will die in the second year after surgery (between years 1 and 2) is 23%.
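The integral above is easy to check numerically; this sketch uses only the standard library, comparing the closed form e^(−1) − e^(−2) against a crude midpoint Riemann sum:

```python
import math

# P(1 <= x <= 2) for the exponential density p(x) = e^(-x):
# the closed form is e^(-1) - e^(-2).
exact = math.exp(-1) - math.exp(-2)

# Cross-check with a midpoint Riemann sum over [1, 2].
n = 100_000
approx = sum(math.exp(-(1 + (i + 0.5) / n)) * (1 / n) for i in range(n))

assert abs(exact - 0.2325) < 1e-3   # about 23%, as stated on the slide
assert abs(exact - approx) < 1e-6   # the numeric integral agrees
```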

Example 2: Uniform distribution The uniform distribution: all values are equally likely. f(x) = 1, for 0 ≤ x ≤ 1. We can see it’s a probability distribution because it integrates to 1 (the area under the curve, a 1 x 1 square, is 1).

Example: Uniform distribution What’s the probability that x is between 0 and ½? P(0 ≤ x ≤ ½) = ½. Clinical Research Example: When randomizing patients in an RCT, we often use a random number generator on the computer. These programs work by randomly generating a number between 0 and 1 (with equal probability of every number in between). Then a subject who gets X < .5 is assigned to one arm and a subject who gets X ≥ .5 to the other (e.g., treatment).
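A hypothetical sketch of this randomization scheme; the function name, the 0.5 cutoff direction, and the seed are our own illustrative assumptions, not taken from any specific trial software:

```python
import random

# Draw X ~ Uniform(0, 1) per subject and assign treatment when X >= 0.5.
# Since P(X >= 0.5) = 0.5 under the uniform distribution, this yields
# a 50/50 allocation on average.
def randomize(n_subjects, seed=0):
    rng = random.Random(seed)
    return ["treatment" if rng.random() >= 0.5 else "control"
            for _ in range(n_subjects)]

arms = randomize(10_000)
frac_treated = arms.count("treatment") / len(arms)
assert abs(frac_treated - 0.5) < 0.02   # roughly half the subjects are treated
```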

Expected value of a random variable Expected value is just the average or mean (µ) of random variable x. It’s sometimes called a “weighted average” because more frequent values of X are weighted more highly in the average. It’s also how we expect X to behave, on average, over the long run (“frequentist” view again).

Expected value, formally Discrete case: E(X) = Σ_i x_i p(x_i). Continuous case: E(X) = the integral of x p(x) dx over all x.

Symbol Interlude E(X) = µ: these symbols are used interchangeably.

Example: expected value Recall the probability distribution of ER arrivals from the earlier practice problem, with each value x listed against its probability P(x) (the probabilities shown include .4, .2, and .1).

Sample Mean is a special case of Expected Value… Sample mean, for a sample of n subjects: x̄ = (Σ_i x_i)/n = Σ_i x_i (1/n). The probability (frequency) of each person in the sample is 1/n.
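The equivalence can be seen directly in code; a small sketch with made-up data, computing the ordinary sample mean and the expected value under equal weights 1/n:

```python
# Sample mean as an expected value: each of the n observations gets
# probability weight 1/n, so E(X) under those weights equals x-bar.
data = [4.0, 8.0, 6.0, 2.0]

xbar = sum(data) / len(data)                       # ordinary sample mean
weighted = sum(x * (1 / len(data)) for x in data)  # sum of x * p(x), p(x) = 1/n

assert xbar == weighted == 5.0
```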

Expected Value Expected value is an extremely useful concept for good decision-making!

Example: the lottery The Lottery (also known as a tax on people who are bad at math…) A certain lottery works by picking 6 numbers from 1 to 49. It costs $1.00 to play the lottery, and if you win, you win $2 million after taxes. If you play the lottery once, what are your expected winnings or losses?

Lottery Calculate the probability of winning in 1 try: 1/(49 choose 6) = 1/13,983,816 ≈ 7.2 x 10^(−8). “49 choose 6”: out of 49 numbers, this is the number of distinct combinations of 6. The probability function (note, sums to 1.0): x = −$1 with p(x) ≈ 1 − 7.2 x 10^(−8); x = +$2 million with p(x) = 7.2 x 10^(−8).

Expected Value The probability function: x = −$1 with p(x) ≈ 1; x = +$2 million with p(x) = 7.2 x 10^(−8). Expected Value E(X) = P(win) x $2,000,000 + P(lose) x (−$1.00) = 2.0 x 10^6 x 7.2 x 10^(−8) + (≈1) x (−1) = .144 − 1 ≈ −$.86. Negative expected value is never good! You shouldn’t play if you expect to lose money!
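A quick check of the lottery arithmetic (requires Python 3.8+ for `math.comb`):

```python
import math

# Lottery expected value: win $2,000,000 with p = 1/C(49, 6),
# otherwise lose the $1 ticket price.
p_win = 1 / math.comb(49, 6)            # C(49,6) = 13,983,816, so p ~ 7.2e-8
ev = 2_000_000 * p_win + (-1) * (1 - p_win)

assert math.comb(49, 6) == 13_983_816
assert abs(ev - (-0.857)) < 0.001       # about -$.86 per play, matching the slide
```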

Expected Value If you play the lottery every week for 10 years, what are your expected winnings or losses? 520 x (−$.86) = −$447.20

Gambling (or how casinos can afford to give so many free drinks…)

A roulette wheel has the numbers 1 through 36, as well as 0 and 00. If you bet $1 that an odd number comes up, you win or lose $1 according to whether or not that event occurs. If random variable X denotes your net gain, X = 1 with probability 18/38 and X = −1 with probability 20/38. E(X) = 1(18/38) − 1(20/38) = −$.053. On average, the casino wins (and the player loses) 5 cents per game. The casino rakes in even more if the stakes are higher: E(X) = 10(18/38) − 10(20/38) = −$.53. If the cost is $10 per game, the casino wins an average of 53 cents per game. If 10,000 games are played in a night, that’s a cool $5300.
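The casino arithmetic can be verified the same way; the helper name is ours:

```python
# Roulette net-gain expectation for a bet that an odd number comes up:
# 18 winning slots, 20 losing slots (1-36 plus 0 and 00 = 38 total).
def roulette_ev(stake):
    return stake * (18 / 38) + (-stake) * (20 / 38)

assert abs(roulette_ev(1) - (-0.0526)) < 1e-3       # about -5 cents per $1 game
assert abs(roulette_ev(10) - (-0.526)) < 1e-2       # about -53 cents per $10 game
assert abs(10_000 * roulette_ev(10) - (-5263)) < 10 # house take over 10,000 games
```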

Expected value isn’t everything though… Take the hit new show “Deal or No Deal.” Everyone knows the rules? Let’s say you are down to two cases left: $1 and $400,000. The banker offers you $200,000. So, Deal or No Deal?

Deal or No Deal… This could really be represented as a probability distribution and a non-random variable: No Deal: x = +$1 with p(x) = .50 and x = +$400,000 with p(x) = .50. Deal: x = +$200,000 with p(x) = 1.0.

Expected value doesn’t help… No Deal: E(X) = .50(+$1) + .50(+$400,000) ≈ +$200,000. Deal: E(X) = 1.0(+$200,000) = +$200,000. The two expected values are essentially the same.

How to decide? Variance! If you take the deal, the variance/standard deviation is 0. If you don’t take the deal, what is the average deviation from the mean? What’s your gut guess?

Variance/standard deviation σ² = Var(x) = E[(x − µ)²] “The expected (or average) squared distance (or deviation) from the mean”

Variance, formally Discrete case: Var(X) = Σ_i (x_i − µ)² p(x_i). Continuous case: Var(X) = the integral of (x − µ)² p(x) dx over all x.

Symbol Interlude Var(X) = σ²; SD(X) = σ. These symbols are used interchangeably.

Similarity to empirical variance The variance of a sample: s² = Σ_i (x_i − x̄)²/(n − 1). Division by n − 1 reflects the fact that we have lost a “degree of freedom” (piece of information) because we had to estimate the sample mean before we could estimate the sample variance.

Variance Now you examine your personal risk tolerance…

Practice Problem On the roulette wheel, X = 1 with probability 18/38 and X = −1 with probability 20/38. We already calculated the mean to be −$.053. What’s the variance of X?

Answer Var(X) = E(X²) − µ². Since X is always +1 or −1, X² = 1 and E(X²) = 1, so Var(X) = 1 − (.053)² ≈ .997. Standard deviation is $.99. Interpretation: On average, you’re either 1 dollar above or 1 dollar below the mean, which is just under zero. Makes sense!
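The variance calculation can be confirmed numerically using E(X²) − µ², which is valid because X only takes the values +1 and −1 (so X² is always 1):

```python
import math

# Variance of the $1 roulette bet: X is +1 or -1, so X^2 = 1 and E(X^2) = 1.
p_win, p_lose = 18 / 38, 20 / 38
mu = 1 * p_win + (-1) * p_lose                      # about -0.053
var = (1**2) * p_win + ((-1)**2) * p_lose - mu**2   # E(X^2) - mu^2
sd = math.sqrt(var)

assert abs(mu - (-0.0526)) < 1e-3
assert abs(var - 0.9972) < 1e-3     # E(X^2) = 1, minus a tiny mu^2
assert abs(sd - 0.9986) < 1e-3      # about $.99, matching the slide
```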

Review Question 3 The expected value and variance of a coin toss (H = 1, T = 0) are? a. .50, .50  b. .50, .25  c. .25, .50  d. .25, .25

Review Question 3 The expected value and variance of a coin toss are? Answer: b. E(X) = .5(1) + .5(0) = .50; Var(X) = E(X²) − µ² = .50 − .25 = .25.

Review Random variable Coin flip experiment: X = 0 (tails) or X = 1 (heads). X is a random variable.

Review Probability mass function (discrete) Example: Coin flip experiment, with x taking the values 0 and 1, each with probability P(x). Constraint: P(x) ≥ 0. Any other constraints? Hint: What is the sum? (The probabilities must sum to 1.)

Review Probability density function (continuous) f(x) ≥ 0. Examples? Unlike the discrete case, the density function does not represent a probability but a rate of change of probability, called the “likelihood.”

Review Probability density function (continuous) f(x) ≥ 0, and f(x) integrates to 1.0. P(x0 < x < x0 + dx) = f(x0)·dx, but P(x = x0) = 0.

The Gaussian Distribution N(x | µ, σ²) = (1/√(2πσ²)) exp(−(x − µ)²/(2σ²)).

A 2D Gaussian

Central Limit Theorem The distribution of the sum of N i.i.d. random variables becomes increasingly Gaussian as N grows. Example: N uniform [0,1] random variables.

Central Limit Theorem (Coin flip) Flip coin N times. Each outcome has an associated random variable X_i (= 1 if heads, otherwise 0). The number of heads N_H is a random variable, the sum of N i.i.d. random variables: N_H = x_1 + x_2 + … + x_N.

Central Limit Theorem (Coin flip) Probability mass function of N_H with P(Head) = 0.5 (fair coin), shown for N = 5, N = 10, and N = 40.
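A small simulation (our own sketch, with an arbitrary seed) illustrates the same setup: the sample mean of N_H sits near N/2 for each N shown on the slide, and the simulated pmf concentrates there as N grows:

```python
import random

# Simulate N_H = x1 + ... + xN for N coin flips, repeated many times.
def simulate_heads(n_flips, n_trials, seed=1):
    rng = random.Random(seed)
    return [sum(rng.random() < 0.5 for _ in range(n_flips))
            for _ in range(n_trials)]

for n in (5, 10, 40):                  # the N values shown on the slide
    counts = simulate_heads(n, 5_000)
    mean = sum(counts) / len(counts)
    # E(N_H) = n * 0.5; allow a loose simulation tolerance.
    assert abs(mean - n * 0.5) < 0.5
```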

Geometry of the Multivariate Gaussian

Moments of the Multivariate Gaussian (1) The first moment is E[x] = µ: substituting z = x − µ, the term linear in z integrates to zero thanks to the anti-symmetry of z.

Moments of the Multivariate Gaussian (2) The second moment is E[xx^T] = µµ^T + Σ, so the covariance is cov[x] = Σ.

Maximum likelihood Fit a probability density model p(x | θ) to the data: estimate θ. Given independent identically distributed (i.i.d.) data X = (x_1, x_2, …, x_N): Likelihood: p(X | θ) = the product over n of p(x_n | θ). Log likelihood: ln p(X | θ) = Σ_n ln p(x_n | θ). Maximum likelihood: Maximize ln p(X | θ) w.r.t. θ.

Maximum Likelihood for the Gaussian (1) Given i.i.d. data, the log likelihood function is given by ln p(X | µ, σ²) = −(1/(2σ²)) Σ_n (x_n − µ)² − (N/2) ln σ² − (N/2) ln(2π). Sufficient statistics: Σ_n x_n and Σ_n x_n².

Maximum Likelihood for the Gaussian (2) Set the derivative of the log likelihood function to zero, and solve to obtain µ_ML = (1/N) Σ_n x_n. Similarly, σ²_ML = (1/N) Σ_n (x_n − µ_ML)².
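These two estimators are straightforward to implement; the data values below are made up for illustration:

```python
# Maximum-likelihood estimates for a univariate Gaussian:
# mu_ML = (1/N) * sum(x_n); sigma2_ML = (1/N) * sum((x_n - mu_ML)^2).
def gaussian_mle(xs):
    n = len(xs)
    mu = sum(xs) / n
    sigma2 = sum((x - mu) ** 2 for x in xs) / n   # note: divides by N, not N - 1
    return mu, sigma2

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mu, sigma2 = gaussian_mle(xs)

assert mu == 5.0
assert sigma2 == 4.0
```

Note the division by N rather than N − 1: the ML variance estimator is the biased one, in contrast to the sample variance s² discussed earlier.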

Mixtures of Gaussians (1) Old Faithful data set: a single Gaussian vs. a mixture of two Gaussians.

Mixtures of Gaussians (2) Combine simple models into a complex model: p(x) = Σ_{k=1..K} π_k N(x | µ_k, Σ_k), where N(x | µ_k, Σ_k) is the kth component and π_k its mixing coefficient (here K = 3).

Mixtures of Gaussians (3)

Mixtures of Gaussians (4) Determining the parameters µ, Σ, and π using maximum log likelihood: the log likelihood contains the log of a sum, so there is no closed-form maximum. Solution: use standard, iterative, numeric optimization methods or the expectation maximization algorithm (Chapter 9).

Important discrete probability distribution: The binomial

Binomial Probability Distribution A fixed number of observations (trials), n: e.g., 15 tosses of a coin; 20 patients; 1000 people surveyed. A binary outcome: e.g., head or tail in each toss of a coin; disease or no disease. The outcomes are generally called “success” and “failure”; the probability of success is p, the probability of failure is 1 − p. Constant probability for each observation: e.g., the probability of getting a tail is the same each time we toss the coin.

Binomial distribution Take the example of 5 coin tosses. What’s the probability that you flip exactly 3 heads in 5 coin tosses?

Binomial distribution Solution: One way to get exactly 3 heads: HHHTT. What’s the probability of this exact arrangement? P(heads) x P(heads) x P(heads) x P(tails) x P(tails) = (1/2)^3 x (1/2)^2. Another way to get exactly 3 heads: THHHT. Probability of this exact outcome = (1/2)^1 x (1/2)^3 x (1/2)^1 = (1/2)^3 x (1/2)^2.

Binomial distribution In fact, (1/2)^3 x (1/2)^2 is the probability of each unique outcome that has exactly 3 heads and 2 tails. So, the overall probability of 3 heads and 2 tails is (1/2)^3 x (1/2)^2 + (1/2)^3 x (1/2)^2 + … for as many unique arrangements as there are. But how many are there?

Outcomes with exactly 3 heads: THHHT, HHHTT, TTHHH, HTTHH, HHTTH, HTHHT, THTHH, HTHTH, HHTHT, THHTH. The probability of each unique outcome is the same: (1/2)^3 x (1/2)^2. There are 10 such arrangements, since the number of ways to arrange 3 heads in 5 trials is 5C3 = 5!/(3!2!) = 10. Factorial review: n! = n(n − 1)(n − 2)…(2)(1).

P(3 heads and 2 tails) = 5C3 x P(heads)^3 x P(tails)^2 = 10 x (½)^5 = 31.25%
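The same number drops out of a one-line computation (requires Python 3.8+ for `math.comb`):

```python
import math

# P(exactly 3 heads in 5 fair tosses) = C(5,3) * (1/2)^3 * (1/2)^2 = 10 * (1/2)^5.
p = math.comb(5, 3) * (0.5 ** 3) * (0.5 ** 2)

assert math.comb(5, 3) == 10
assert p == 0.3125            # 31.25%, matching the slide
```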

Binomial distribution function: X = the number of heads tossed in 5 coin tosses. The plot of p(x) against the number of heads is symmetric, peaking at 2 and 3 heads (p = 10/32 each).

Binomial distribution, generally Note the general pattern emerging: if you have only two possible outcomes (call them 1/0 or yes/no or success/failure) in n independent trials, then the probability of exactly X “successes” = nCX p^X (1 − p)^(n − X), where n = number of trials, X = # successes out of n trials, p = probability of success, and 1 − p = probability of failure.
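The general formula can be wrapped in a small helper (a sketch; the function name `binom_pmf` is ours) and sanity-checked against the coin example and the requirement that any pmf sums to 1:

```python
import math

# General binomial pmf: P(X = x) = C(n, x) * p^x * (1-p)^(n-x).
def binom_pmf(x, n, p):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

# The pmf sums to 1 over x = 0..n, as any probability distribution must.
assert abs(sum(binom_pmf(x, 5, 0.5) for x in range(6)) - 1.0) < 1e-12
# Reproduces the coin example: 3 heads in 5 tosses.
assert binom_pmf(3, 5, 0.5) == 0.3125
```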

Binomial distribution: example If I toss a coin 20 times, what’s the probability of getting exactly 10 heads? 20C10 x (½)^10 x (½)^10 = 184,756 x (½)^20 ≈ .176

Binomial distribution: example If I toss a coin 20 times, what’s the probability of getting 2 or fewer heads? P(X ≤ 2) = (20C0 + 20C1 + 20C2) x (½)^20 = (1 + 20 + 190)/2^20 ≈ .0002

**All probability distributions are characterized by an expected value and a variance: If X follows a binomial distribution with parameters n and p: X ~ Bin(n, p). Then: E(X) = np; Var(X) = np(1 − p); SD(X) = √(np(1 − p)). Note: the variance will always lie between 0 and .25n, since p(1 − p) reaches its maximum, p(1 − p) = .25, at p = .5.

Practice Problem 1. You are performing a cohort study. If the probability of developing disease in the exposed group is .05 for the study duration, then if you (randomly) sample 500 exposed people, how many do you expect to develop the disease? Give a margin of error (+/- 1 standard deviation) for your estimate. 2. What’s the probability that at most 10 exposed people develop the disease?

Answer 1. How many do you expect to develop the disease? Give a margin of error (+/- 1 standard deviation) for your estimate. X ~ binomial (500, .05). E(X) = 500(.05) = 25. Var(X) = 500(.05)(.95) = 23.75. StdDev(X) = √23.75 = 4.87. Answer: 25 ± 4.87.
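The arithmetic checks out numerically:

```python
import math

# Cohort-study sketch: X ~ Binomial(n=500, p=.05) cases among the exposed.
n, p = 500, 0.05
mean = n * p                    # E(X) = np
var = n * p * (1 - p)           # Var(X) = np(1-p)
sd = math.sqrt(var)

assert abs(mean - 25.0) < 1e-9
assert abs(var - 23.75) < 1e-9
assert abs(sd - 4.87) < 0.01    # 25 +/- 4.87, matching the slide
```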

Answer 2. What’s the probability that at most 10 exposed subjects develop the disease? This is asking for a CUMULATIVE PROBABILITY: the probability of 0 getting the disease or 1 or 2 or 3 or 4 or up to 10. P(X ≤ 10) = P(X=0) + P(X=1) + P(X=2) + … + P(X=10), where each term is P(X = k) = 500Ck (.05)^k (.95)^(500−k).

Practice Problem: You are conducting a case-control study of smoking and lung cancer. If the probability of being a smoker among lung cancer cases is .6, what’s the probability that in a group of 8 cases you have: a. Less than 2 smokers? b. More than 5? c. What are the expected value and variance of the number of smokers?

Answer

Answer, continued E(X) = 8(.6) = 4.8. Var(X) = 8(.6)(.4) = 1.92. StdDev(X) = √1.92 ≈ 1.39. a. P(X < 2) = P(0) + P(1) = (.4)^8 + 8(.6)(.4)^7 ≈ .0085. b. P(X > 5) = P(6) + P(7) + P(8) ≈ .3154.
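Recomputing these probabilities directly from the binomial pmf (a sketch; `binom_pmf` is our helper name) gives approximately .0085 for fewer than 2 smokers and .3154 for more than 5:

```python
import math

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1-p)^(n-x)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

# Case-control sketch: X ~ Binomial(n=8, p=.6) smokers among 8 cases.
p_less_than_2 = binom_pmf(0, 8, 0.6) + binom_pmf(1, 8, 0.6)
p_more_than_5 = sum(binom_pmf(x, 8, 0.6) for x in (6, 7, 8))

assert abs(p_less_than_2 - 0.0085) < 1e-3
assert abs(p_more_than_5 - 0.3154) < 1e-3
assert abs(8 * 0.6 - 4.8) < 1e-9          # E(X) = np
assert abs(8 * 0.6 * 0.4 - 1.92) < 1e-9   # Var(X) = np(1-p)
```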

Review Question 4 In your case-control study of smoking and lung-cancer, 60% of cases are smokers versus only 10% of controls. What is the odds ratio between smoking and lung cancer? a. 2.5  b.  c.  d. 6.0  e. .05

Review Question 4 In your case-control study of smoking and lung-cancer, 60% of cases are smokers versus only 10% of controls. What is the odds ratio between smoking and lung cancer? Answer: OR = (.6/.4)/(.1/.9) = 1.5/.111 = 13.5.

Review Question 5 What’s the probability of getting exactly 5 heads in 10 coin tosses? a. b. c. d.

Review Question 5 What’s the probability of getting exactly 5 heads in 10 coin tosses? Answer: 10C5 x (½)^5 x (½)^5 = 252 x (½)^10 ≈ .246.

Review Question 6 A coin toss can be thought of as an example of a binomial distribution with N = 1 and p = .5. What are the expected value and variance of a coin toss? a. .5, .25  b. 1.0, 1.0  c. 1.5, .5  d. .25, .5  e. .5, .5

Review Question 6 A coin toss can be thought of as an example of a binomial distribution with N = 1 and p = .5. What are the expected value and variance of a coin toss? Answer: a. E(X) = np = .5; Var(X) = np(1 − p) = .5(.5) = .25.

Review Question 7 If I toss a coin 10 times, what is the expected value and variance of the number of heads? a. 5, 5 b. 10, 5 c. 2.5, 5 d. 5, 2.5 e. 2.5, 10

Review Question 7 If I toss a coin 10 times, what is the expected value and variance of the number of heads? Answer: d. E(X) = np = 10(.5) = 5; Var(X) = np(1 − p) = 10(.5)(.5) = 2.5.

Review Question 8 In a randomized trial with n = 150, the goal is to randomize half to treatment and half to control. The number of people randomized to treatment is a random variable X. What is the probability distribution of X? a. X ~ Normal(µ = 75, σ = 10)  b. X ~ Exponential(µ = 75)  c. X ~ Uniform  d. X ~ Binomial(N = 150, p = .5)  e. X ~ Binomial(N = 75, p = .5)

Review Question 8 In a randomized trial with n = 150, every subject has a 50% chance of being randomized to treatment. The number of people randomized to treatment is a random variable X. What is the probability distribution of X? Answer: d. X ~ Binomial(N = 150, p = .5).

Review Question 9 In the same RCT with n=150, if 69 end up in the treatment group and 81 in the control group, how far off is that from expected? a. Less than 1 standard deviation b. 1 standard deviation c. Between 1 and 2 standard deviations d. More than 2 standard deviations

Review Question 9 In the same RCT with n = 150, if 69 end up in the treatment group and 81 in the control group, how far off is that from expected? Expected = 75; 81 and 69 are both 6 away from the expected. Variance = 150(.25) = 37.5, so Std Dev = √37.5 ≈ 6.1. Therefore, the result is about 1 SD away from expected.

Proportions… The binomial distribution forms the basis of statistics for proportions. A proportion is just a binomial count divided by n. For example, if we sample 200 cases and find 60 smokers, X=60 but the observed proportion=.30. Statistics for proportions are similar to binomial counts, but differ by a factor of n.

Stats for proportions For the binomial count X: µ = np, σ² = np(1 − p), σ = √(np(1 − p)). For the proportion p̂ = X/n: µ = p, σ² = p(1 − p)/n, σ = √(p(1 − p)/n). P-hat stands for “sample proportion.” The two sets of formulas differ by a factor of n.
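A numeric sketch of the count-versus-proportion relationship, using the earlier example of 60 smokers among 200 cases as the underlying rate (the choice p = .30 is our illustrative assumption):

```python
import math

# Sample proportion p-hat = X / n for X ~ Binomial(n, p).
n, p = 200, 0.30
count_mean, count_var = n * p, n * p * (1 - p)   # for the count X
prop_mean, prop_var = p, p * (1 - p) / n         # for p-hat = X / n

assert abs(count_mean - 60.0) < 1e-9             # 60 smokers expected of 200
assert abs(count_var - 42.0) < 1e-9
# Var(X/n) = Var(X) / n^2, which reduces to p(1-p)/n.
assert abs(prop_var - count_var / n**2) < 1e-12
assert abs(math.sqrt(prop_var) - 0.0324) < 1e-3
```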

It all comes back to normal… Statistics for proportions are based on a normal distribution, because the binomial can be approximated as normal if np > 5 (and, likewise, n(1 − p) > 5).