Physics 114: Lecture 9
Probability Density Functions
Dale E. Gary, NJIT Physics Department
February 12, 2010

Binomial Distribution
- If you raise the sum of two variables to a power, you get:
    (a + b)^1 = a + b
    (a + b)^2 = a^2 + 2ab + b^2
    (a + b)^3 = a^3 + 3a^2 b + 3a b^2 + b^3
    and so on.
- Writing only the coefficients, you begin to see a pattern:
    1 1
    1 2 1
    1 3 3 1
    1 4 6 4 1

Binomial Distribution
- Remarkably, this pattern is also the one that governs the possibilities of tossing n coins.
- With 3 coins, there are 8 ways for them to land (HHH, HHT, HTH, THH, HTT, THT, TTH, TTT).
- In general, there are 2^n possible ways for n coins to land.
- How many permutations are there for a given row above, e.g. how many permutations give 1 head and 2 tails? Obviously, 3 (HTT, THT, TTH).
- How many permutations are there for x heads and n - x tails, for general n and x? The number of combinations in each row is "n choose x":
    C(n, x) = n!/(x!(n - x)!).
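For reference (this snippet is not part of the original slides), these combination counts can be checked in MatLAB with the built-in nchoosek function, which is part of base MatLAB:
    arrayfun(@(x) nchoosek(3, x), 0:3)        % combinations for 0,1,2,3 heads: [1 3 3 1]
    sum(arrayfun(@(x) nchoosek(3, x), 0:3))   % total number of outcomes: 8 = 2^3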

Probability
- With fair coins, tossing a coin will result in an equal chance, 50% or ½, of its ending up heads. Let us call this probability p. Obviously, the probability of tossing a tails, q, is q = (1 - p).
- With 3 coins, the probability of getting any single one of the combinations is 1/2^n = 1/8 (since there are 8 combinations, and each is equally probable). This comes from (½)(½)(½), the product of the probability p = ½ for each coin to land heads.
- If we want to know the probability of getting, say, 1 heads and 2 tails, we just multiply the probability of any one combination (1/8) by the number of ways of getting 1 heads and 2 tails, i.e. 3, for a total probability of 3/8.
- To be really general, say the coins were not fair, so p ≠ q. Then the probability of getting heads, tails, tails would be (p)(q)(q) = p^1 q^2.
- Finally, the probability P(x; n, p) of getting x heads given n coins, each of which has probability p of landing heads, is
    P(x; n, p) = n!/(x!(n - x)!) p^x q^(n-x).

Binomial Distribution
- This is the binomial distribution, which we write P_B:
    P_B(x; n, p) = n!/(x!(n - x)!) p^x (1 - p)^(n-x).
- Let's see if it works. For 1 heads with a toss of 3 fair coins, x = 1, n = 3, p = ½, we get
    P_B(1; 3, ½) = [3!/(1! 2!)] (1/2)^1 (1/2)^2 = 3 × 1/8 = 3/8.
- For no heads, and all tails, we get
    P_B(0; 3, ½) = [3!/(0! 3!)] (1/2)^0 (1/2)^3 = 1/8.
- Say the coins are not fair, but p = ¼. Then the probability of 2 heads and 1 tails is
    P_B(2; 3, ¼) = [3!/(2! 1!)] (1/4)^2 (3/4)^1 = 3 × 3/64 = 9/64.
- You'll show for homework that the sum of all probabilities for this (and any) case is 1, i.e. the probabilities are normalized.
- Note: 0! ≡ 1.
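These hand calculations can be cross-checked numerically (a sketch, not from the slides) using base MatLAB's nchoosek; the binopdf function used later in the lecture gives identical results but requires the Statistics Toolbox:
    PB = @(x, n, p) nchoosek(n, x) .* p.^x .* (1 - p).^(n - x);   % binomial probability
    PB(1, 3, 0.5)     % 3/8  = 0.3750
    PB(0, 3, 0.5)     % 1/8  = 0.1250
    PB(2, 3, 0.25)    % 9/64 = 0.1406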

Binomial Distribution
- To see the connection of this to the sum of two variables raised to a power, replace a and b with p and q:
    (p + q)^2 = p^2 + 2pq + q^2,   (p + q)^3 = p^3 + 3p^2 q + 3p q^2 + q^3,   etc.
- Since p + q = 1, each of these powers also equals one on the left side, while the right side expresses how the total probability is split among the different combinations.
- When p = q = ½, for example, the binomial triangle becomes
    1/2   1/2
    1/4   2/4   1/4
    1/8   3/8   3/8   1/8
    1/16  4/16  6/16  4/16  1/16
- In MatLAB, use binopdf(x,n,p) to calculate one row of this triangle, e.g. binopdf(0:3,3,0.5) prints 0.125, 0.375, 0.375, 0.125.
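As a quick check of the normalization (a sketch, assuming the Statistics Toolbox for binopdf), every row of the triangle sums to 1:
    binopdf(0:3, 3, 0.5)          % one row of the triangle: [0.125 0.375 0.375 0.125]
    sum(binopdf(0:3, 3, 0.5))     % 1
    sum(binopdf(0:4, 4, 0.5))     % the 1/16 4/16 6/16 4/16 1/16 row also sums to 1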

Binomial Distribution
- Let's say we toss 10 coins and ask how many heads we will see. The 10th row of the triangle would be plotted as shown at right.
- The binomial distribution applies to yes/no cases, i.e. cases where you want to know the probability of something happening vs. it not happening.
- Say we want to know the probability of getting a 1 when rolling five 6-sided dice. Then p = 1/6 (the probability of rolling a 1 on one die), and q = 1 - p = 5/6 (the probability of NOT rolling a 1). The binomial distribution applies to this case, with P_B(x; 5, 1/6). The plot is shown at right.
    >> binopdf(0:5, 5, 1/6)
    prints 0.4019, 0.4019, 0.1608, 0.0322, 0.0032, 0.0001.
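Because the plots themselves are not reproduced in this transcript, here is a sketch of how the two distributions could be plotted in MatLAB (binopdf again assumes the Statistics Toolbox):
    figure; bar(0:10, binopdf(0:10, 10, 0.5))           % 10 fair coins: number of heads
    xlabel('Number of heads'); ylabel('Probability')
    figure; bar(0:5, binopdf(0:5, 5, 1/6))              % five 6-sided dice: number of 1s rolled
    xlabel('Number of 1s rolled'); ylabel('Probability')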

Binomial Distribution Mean
- Let's say we toss 10 coins N = 100 times. Then we would multiply the PDF by N to find out how many times we would get x heads.
- The mean of the distribution is, as before:
    μ = Σ_x x P_B(x; n, p) = np.
- For 10 coins, with p = ½, we get μ = np = 5.
- For 5 dice, with p = 1/6, we get μ = np = 5/6 ≈ 0.83.
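A direct numerical check of μ = np (a sketch, again using binopdf from the Statistics Toolbox):
    x = 0:10;
    sum(x .* binopdf(x, 10, 0.5))    % weighted mean for 10 coins: 5
    x = 0:5;
    sum(x .* binopdf(x, 5, 1/6))     % weighted mean for 5 dice: 5/6 = 0.8333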

Binomial Standard Deviation
- The standard deviation of the distribution is the square root of the variance (the “second moment” about the mean):
    σ^2 = Σ_x (x - μ)^2 P_B(x; n, p) = np(1 - p),   so   σ = sqrt(np(1 - p)).
- For 10 coins, with p = ½, we get σ = sqrt(10 × ½ × ½) = sqrt(2.5) ≈ 1.58.
- For 5 dice, with p = 1/6, we get σ = sqrt(5 × (1/6) × (5/6)) = sqrt(25/36) = 5/6 ≈ 0.83.
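The same value follows from the definition of the second moment about the mean (a sketch using binopdf):
    x  = 0:10;
    P  = binopdf(x, 10, 0.5);
    mu = sum(x .* P);                  % 5
    sqrt(sum((x - mu).^2 .* P))        % sqrt(2.5) = 1.5811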

Summary of Binomial Distribution
- The binomial distribution is P_B:
    P_B(x; n, p) = n!/(x!(n - x)!) p^x (1 - p)^(n-x).
- The mean is μ = np.
- The standard deviation is σ = sqrt(np(1 - p)).
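For reference (not used in the original slides), the Statistics Toolbox can also return both moments directly via binostat:
    [m, v] = binostat(10, 0.5);    % mean 5, variance 2.5
    sqrt(v)                        % 1.5811
    [m, v] = binostat(5, 1/6);     % mean 0.8333, variance 25/36
    sqrt(v)                        % 0.8333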

Poisson Distribution
- An approximation to the binomial distribution is very useful for the case where n is very large (i.e. rolls of a die with an infinite number of sides?) and p is very small. This is called the Poisson distribution.
- This is the case for counting experiments, such as the decay of radioactive material, or measuring photons at low light levels.
- To derive it, start with the binomial distribution with n large and p << 1, but with a well-defined mean μ = np. Then
    P_B(x; n, p) = n!/(x!(n - x)!) p^x (1 - p)^(n-x).
- The term n!/(n - x)! = n(n - 1)(n - 2)...(n - x + 1) ≈ n^x because x is small compared to n, so most of the terms cancel, leaving a total of x factors each approximately equal to n.
- This gives
    P(x) ≈ (n^x / x!) p^x (1 - p)^(n-x) = (μ^x / x!) (1 - p)^(n-x).
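The limit can be illustrated numerically; this sketch (assuming binopdf and poisspdf from the Statistics Toolbox) compares a binomial with large n and small p to a Poisson with the same mean μ = np:
    x  = 0:10;
    mu = 5;
    n  = 1000;  p = mu/n;                 % large n, small p, same mean
    [binopdf(x, n, p); poisspdf(x, mu)]   % the two rows agree closely, and better still as n grows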

Poisson Distribution
- Now, the term (1 - p)^(-x) ≈ 1 for small p, and with some algebra we can show that the term (1 - p)^n ≈ e^(-μ).
- Thus, the final Poisson distribution depends only on x and μ, and is defined as
    P_P(x; μ) = (μ^x / x!) e^(-μ).
- The text shows that the expectation value of x (i.e. the mean) is μ.
- Remarkably, the standard deviation is given by the second moment as σ = sqrt(μ).
- These are a little tedious to prove, but all we need for now is to know that the standard deviation is the square root of the mean.
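Both statements are easy to confirm numerically (a sketch using poisspdf; the infinite sum is truncated at a large x where the tail probability is negligible):
    mu = 1.69;                       % the mean used in Example 2.3 below
    x  = 0:50;
    P  = poisspdf(x, mu);
    m  = sum(x .* P)                 % expectation value: 1.69
    sqrt(sum((x - m).^2 .* P))       % standard deviation: sqrt(1.69) = 1.30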

Example 2.3
- Some students measure background counts of cosmic rays. They record the number of counts in their detector for a series of 2-s intervals, and find a mean of 1.69 counts/interval.
- They can use the standard deviation formula from chapter 1,
    s = sqrt( [1/(N - 1)] Σ (x_i - x̄)^2 ),
  to get a standard deviation directly from the data. They do this and get s = …
- They can also estimate the standard deviation by σ ≈ sqrt(μ) = sqrt(1.69) = 1.30.
- Now they change the length of time they count from 2-s intervals to 15-s intervals, so the mean number of counts in each interval increases. They now measure a mean of 11.48, which implies σ ≈ sqrt(11.48) ≈ 3.39, while they again calculate s directly from their measurements to find s = …
- We can plot the theoretical distributions using MatLAB's poisspdf(x,mu), e.g. poisspdf(0:8,1.69) gives 0.1845, 0.3118, 0.2635, 0.1484, 0.0627, 0.0212, 0.0060, 0.0014, 0.0003.
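Here is a sketch of the comparison the students made, using a hypothetical vector of recorded counts (the variable counts and its values are placeholders, not the actual class data):
    counts    = [2 1 0 3 1 2 1 4 0 2];    % hypothetical data, one entry per 2-s interval
    xbar      = mean(counts)              % sample mean of the data
    s_direct  = std(counts)               % standard deviation computed directly (1/(N-1) form)
    s_poisson = sqrt(xbar)                % Poisson estimate of the standard deviation
    plot(0:8, poisspdf(0:8, xbar), 'o-')  % theoretical Poisson with the measured mean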

Example 2.3, cont’d
- The distributions for these two cases are shown in the plots at right.
- You can see that for a small mean, the distribution is quite asymmetrical. As the mean increases, the distribution becomes somewhat more symmetrical (but is still not symmetrical at 11.48 counts/interval).
- I have overplotted the mean and standard deviation. You can see that the mean does not coincide with the peak (the most probable value).

Example 2.3, cont’d
- Here is the higher-mean plot with the equivalent Gaussian (normal distribution) overlaid.
- For large means (high counts), the Poisson distribution approaches the Gaussian distribution, which we will describe further next time.
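Since the overlay plot is not reproduced here, this sketch (assuming poisspdf and normpdf from the Statistics Toolbox) shows how such a comparison could be drawn, giving the Gaussian mean μ and standard deviation sqrt(μ):
    mu = 11.48;
    x  = 0:30;
    bar(x, poisspdf(x, mu)); hold on                            % Poisson probabilities
    xc = 0:0.1:30;
    plot(xc, normpdf(xc, mu, sqrt(mu)), 'r', 'LineWidth', 2)    % equivalent Gaussian
    xlabel('Counts per 15-s interval'); ylabel('Probability')
    legend('Poisson', 'Gaussian')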