
1 Chapter 1 Probability and Distributions Math 6203 Fall 2009 Instructor: Ayona Chatterjee

2 1.1 INTRODUCTION Every experiment has one or more outcomes. Random experiment -> an experiment whose single outcome cannot be predicted, but whose outcomes over a long series of repetitions follow a certain rule. Sample space -> the collection of every possible outcome. – Denoted by 𝒞. – Let c denote an element of 𝒞 and let C represent a collection of elements of 𝒞 (an event).

3 Examples: Toss a coin, toss 2 coins, throw a die. Write the sample spaces. We define f/N as the relative frequency of event C in N performances of the experiment. As N increases, f/N -> p. Here p is the number that the relative frequency of event C will approximately equal in future performances of the experiment. p is called the probability of event C (see the simulation sketch below).
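A minimal simulation sketch of this idea (my own, not from the slides): estimating p for the event "heads" by tracking f/N as N grows. The value p = 0.5 is an assumption built into the fair-coin model.

```python
import random

def relative_frequency(n_trials, event=lambda outcome: outcome == "H"):
    """Estimate P(event) by the relative frequency f/N over n_trials coin tosses."""
    f = 0
    for _ in range(n_trials):
        outcome = random.choice(["H", "T"])  # fair coin: each face has probability 1/2
        f += event(outcome)
    return f / n_trials

for n in [10, 100, 10_000, 1_000_000]:
    print(n, relative_frequency(n))  # f/N settles near p = 0.5 as N grows
```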

4 Note: – The above idea works only if the experiment can be repeated under identical conditions. – But sometimes p is personal or subjective. – For both interpretations, the mathematical theory is the same.

5 1.2 SET THEORY Set -> a collection of particular objects. – Notation: c ∈ C means c is an element of set C. One-to-one correspondence – Every member of A is paired with exactly one member of B. No two ordered pairs have the same first element or the same second element.

6 Definitions If each element of C₁ is also an element of C₂, the set C₁ is called a subset of the set C₂. Example: C₁ = {x: 0 ≤ x ≤ 1} and C₂ = {x: -1 ≤ x ≤ 2}; then C₁ is a subset of C₂.

7 Null set: If C has no elements, we write C = Φ. Union: The set of all elements that belong to at least one of the sets C₁ and C₂ is called the union of C₁ and C₂. Notation: C₁ ∪ C₂. Intersection: The set of all elements that belong to each of the sets C₁ and C₂ is called the intersection of C₁ and C₂. Notation: C₁ ∩ C₂.

8 Complement: The set that consists of all elements of the space 𝒞 that are not elements of C is called the complement of C. We will denote the complement of C by Cᶜ. De Morgan's Laws: (C₁ ∪ C₂)ᶜ = C₁ᶜ ∩ C₂ᶜ and (C₁ ∩ C₂)ᶜ = C₁ᶜ ∪ C₂ᶜ. Let us prove the laws and do some examples for union, intersection, and complement with – C₁ = {x: 0 ≤ x ≤ 1} and C₂ = {x: -1 ≤ x ≤ 2}
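A quick finite-set check of De Morgan's laws (my own sketch, using integer stand-ins for the interval example above):

```python
# Universe and two events, as finite stand-ins for the interval example above
universe = set(range(-3, 4))
C1 = {0, 1}
C2 = {-1, 0, 1, 2}

complement = lambda s: universe - s

# De Morgan's laws: (C1 ∪ C2)^c == C1^c ∩ C2^c and (C1 ∩ C2)^c == C1^c ∪ C2^c
assert complement(C1 | C2) == complement(C1) & complement(C2)
assert complement(C1 & C2) == complement(C1) | complement(C2)
print("De Morgan's laws hold on this universe")
```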

9 Let C be a set in one-dimensional space and let Q(C) be equal to the number of points in C that correspond to positive integers. Then Q(C) is a function of the set C. – Example: C = {x: 0 < x < 5}, then Q(C) = 4 – C = {-1, -2}, Q(C) = 0 Let C be a set in one-dimensional space and let Q(C) = ∑f(x), summed over C (discrete case), or Q(C) = ∫_C f(x) dx (continuous case). – Note: f(x) has to be chosen with care or else the sum or integral may fail to exist.

10 Examples

11 1.3 THE PROBABILITY SET FUNCTION σ-field: Let B be a collection of subsets of 𝒞. We say B is a σ-field if – B is not empty. – If C ∈ B, then Cᶜ ∈ B (closed under complementation). – If the sequence of sets {C₁, C₂, ...} is in B, then ∪ᵢ Cᵢ ∈ B (closed under countable unions). – Example: B = {C, Cᶜ, Φ, 𝒞} is a σ-field. – The σ-field generated by the open intervals of R is called the Borel σ-field.

12 Probability definition Let 𝒞 be a sample space and let B be a σ-field on 𝒞. Let P be a real-valued function defined on B. Then P is a probability set function if P satisfies the following three conditions. 1. P(C) ≥ 0 for all C ∈ B. 2. P(𝒞) = 1. 3. If {Cₙ} is a sequence of pairwise disjoint sets in B, then P(∪ₙ Cₙ) = ∑ₙ P(Cₙ).
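A small sketch (my own, not from the slides) of a probability set function on the finite sample space of one die roll, where B can be taken to be all subsets:

```python
from fractions import Fraction

sample_space = frozenset(range(1, 7))  # one roll of a fair die

def P(event):
    """Equilikely probability set function: each outcome has probability 1/6."""
    assert event <= sample_space, "event must be a subset of the sample space"
    return Fraction(len(event), len(sample_space))

even = frozenset({2, 4, 6})
print(P(even))          # 1/2, and axiom 1 holds since counts are nonnegative
print(P(sample_space))  # axiom 2: P of the whole space is 1
# axiom 3 on two disjoint sets: additivity
print(P(even | frozenset({1})) == P(even) + P(frozenset({1})))  # True
```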

13 Theorems – work on proofs

14 Inequalities Boole's inequality: P(∪ₙ Cₙ) ≤ ∑ₙ P(Cₙ). Bonferroni's inequality: P(C₁ ∩ C₂) ≥ P(C₁) + P(C₂) - 1.

15 Definitions Mutually exclusive events: two events that have no elements in common, that is, C₁ ∩ C₂ = Φ. Equilikely case: if there are k possible outcomes, each outcome has probability 1/k. Permutations and combinations. – Examples from the poker hand, page 17 (see the counting sketch below).
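A hedged sketch of the kind of poker-hand counting the text does on page 17 (the specific hands below are my choice, not necessarily the book's):

```python
from math import comb

total_hands = comb(52, 5)  # all equally likely 5-card hands from a 52-card deck

# P(all five cards are spades): choose 5 of the 13 spades
p_all_spades = comb(13, 5) / total_hands

# P(four of a kind): pick the rank, take all 4 of it, then any fifth card
p_four_kind = 13 * comb(4, 4) * 48 / total_hands

print(f"P(all spades)     = {p_all_spades:.6f}")
print(f"P(four of a kind) = {p_four_kind:.6f}")
```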

16 Theorems

17 1.4 CONDITIONAL PROBABILITY AND INDEPENDENCE Let C₁ and C₂ both be subsets of 𝒞. We want to define the probability of C₂ relative to C₁. This we call conditional probability, denoted by P(C₂ | C₁). Since C₁ is the sample space now, the only elements of interest are the ones in both C₁ and C₂: P(C₂ | C₁) = P(C₁ ∩ C₂) / P(C₁), provided P(C₁) > 0.

18 Properties of P(C₂ | C₁) Note P(C₂ | C₁) ≥ 0. Note P(C₁ | C₁) = 1.

19 Examples Example: A hand of 5 cards is to be dealt at random without replacement from an ordinary deck of cards. Find the probability of an all-spade hand given that there are at least 4 spades in the hand (see the sketch below). From an ordinary deck of playing cards, cards are to be drawn successively at random without replacement. Find the probability that the third spade appears on the sixth draw.
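A quick check of the first example (my own sketch): with C₁ = {at least 4 spades} and C₂ = {all 5 spades}, C₂ is contained in C₁, so P(C₂ | C₁) = P(C₂) / P(C₁), and the common C(52, 5) denominators cancel.

```python
from math import comb

all_spades = comb(13, 5)                  # hands with 5 spades
exactly_four = comb(13, 4) * comb(39, 1)  # hands with 4 spades and 1 non-spade
at_least_four = exactly_four + all_spades

# Since {all spades} ⊂ {at least 4 spades}, the intersection is {all spades}
p = all_spades / at_least_four
print(p)  # ≈ 0.0441
```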

20 Bayes Theorem Law of total probability – Assume that the events C₁, ..., Cₖ are mutually exclusive and exhaustive. Let C be another event; C occurs with one and only one of the events Cᵢ, so P(C) = ∑ᵢ P(Cᵢ)P(C | Cᵢ). Bayes theorem – to find the conditional probability of Cⱼ given C: P(Cⱼ | C) = P(Cⱼ)P(C | Cⱼ) / ∑ᵢ P(Cᵢ)P(C | Cᵢ).

21 Bayes Theorem The probabilities P(Cᵢ) are called prior probabilities. The probabilities P(Cᵢ | C) are called posterior probabilities. Let's work through example 1.4.5, page 25, from the textbook.
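A generic numeric sketch of Bayes' theorem (the priors and likelihoods below are made up for illustration; they are not the numbers of example 1.4.5):

```python
# Hypothetical three-source setup: priors P(C_i) and likelihoods P(C | C_i),
# where C is some observed event. All numbers are illustrative only.
priors = [0.5, 0.3, 0.2]
likelihoods = [0.01, 0.02, 0.05]

# Law of total probability: P(C) = sum of P(C_i) * P(C | C_i)
total = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes theorem: posterior for each C_j
posteriors = [p * l / total for p, l in zip(priors, likelihoods)]
for i, post in enumerate(posteriors, start=1):
    print(f"P(C_{i} | C) = {post:.4f}")  # the posteriors sum to 1
```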

22 Independent Events Let C₁ and C₂ be two events. We say that C₁ and C₂ are independent if P(C₁ ∩ C₂) = P(C₁)P(C₂). Suppose we have three events; then the events are mutually independent if and only if – they are pairwise independent, and – P(C₁ ∩ C₂ ∩ C₃) = P(C₁)P(C₂)P(C₃). Let's work through example 1.4.10, page 29.

23 1.5 RANDOM VARIABLES Consider a random experiment with a sample space 𝒞. A function X, which assigns to each element c of 𝒞 one and only one number X(c) = x, is a random variable. Notation: – B: any event – D: a new sample space (the space of X) – {dᵢ}: simple events which belong to D

24 PMF Here P_X is completely determined by the function p_X(dᵢ) = P[{c: X(c) = dᵢ}]. The function p_X(dᵢ) is called the probability mass function (pmf). Example: Write the pmf for X: the sum of the upfaces on a roll of a pair of 6-sided dice. Compute the probability of the event B = {x: x = 7 or x = 11} (see the sketch below).
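A direct enumeration sketch for the dice example (my own code):

```python
from collections import Counter
from fractions import Fraction

# pmf of X = sum of the upfaces on a roll of a pair of 6-sided dice:
# count how many of the 36 equally likely pairs give each sum
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {x: Fraction(n, 36) for x, n in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)

# P(B) for B = {x: x = 7 or x = 11}
print(pmf[7] + pmf[11])  # 6/36 + 2/36 = 2/9
```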

25 Cumulative Distribution Function Let X be a random variable. Then its cdf is defined as F_X(x) = P(X ≤ x). Continuing with the previous example, define the cdf of X. For a continuous random variable, the probability density function can be obtained as the derivative of the cdf, f(x) = F′(x).
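Continuing the dice sketch above, the cdf of a discrete random variable is just the running sum of the pmf:

```python
# Running-sum cdf built from the dice pmf (assumes the `pmf` dict from the
# previous sketch is in scope)
cumulative = 0
cdf = {}
for x, p in pmf.items():
    cumulative += p
    cdf[x] = cumulative

print(cdf[6])   # P(X <= 6) = 15/36 = 5/12
print(cdf[12])  # P(X <= 12) = 1
```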

26 Note If X and Y are two random variables and F_X(x) = F_Y(x) for all x in R, then X and Y are equal in distribution, denoted X =ᴰ Y. Though X and Y may be equal in distribution, X and Y themselves may be quite different. See the example in the book.

27 Theorems Let X be a random variable with cumulative distribution function F(x). Then – F is nondecreasing; – lim F(x) = 0 as x -> -∞ and lim F(x) = 1 as x -> ∞; – F is right-continuous. Let X be a rv with cdf F_X. Then for a < b, P[a < X ≤ b] = F_X(b) - F_X(a).

28 Theorem For any random variable X, P[X = x] = F_X(x) - F_X(x⁻), where F_X(x⁻) is the left-hand limit of the cdf at x (the jump of F_X at x). Let's work through the proof.

29 1.6 DISCRETE RANDOM VARIABLES We say a random variable is a discrete rv if its space is either finite or countable. Examples (see the sketch below): – Consider a sequence of independent flips of a coin. Let X be the number of flips required to obtain the first head. What is the pmf for X? – Five fuses are chosen from a lot of 100 fuses. The lot contains 20 defective fuses. Let X be the number of non-defective fuses; what is the pmf for X?
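A sketch of both pmfs (assuming a fair coin in the first example, which the slide does not state):

```python
from fractions import Fraction
from math import comb

def geometric_pmf(x, p=Fraction(1, 2)):
    """P(first head appears on flip x): (1 - p)^(x-1) * p. Fair coin assumed."""
    return (1 - p) ** (x - 1) * p

def hypergeometric_pmf(x, good=80, bad=20, draws=5):
    """P(x non-defective fuses among 5 drawn from 80 good + 20 defective)."""
    return Fraction(comb(good, x) * comb(bad, draws - x), comb(good + bad, draws))

print(geometric_pmf(3))       # (1/2)^2 * (1/2) = 1/8
print(hypergeometric_pmf(5))  # all five fuses non-defective
```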

30 Properties of pmf – 0 ≤ p_X(x) ≤ 1 for all x in the space D of X. – ∑ p_X(x) = 1, where the sum is over the space D.

31 Transformations Suppose we are interested in some random variable Y which is a transformation of X, say Y = g(X). Assume X is discrete with space D_X. Then the space of Y is D_Y = {g(x): x ∈ D_X}, and the pmf of Y can be obtained as p_Y(y) = P[g(X) = y] = ∑ p_X(x), summed over all x ∈ D_X with g(x) = y.

32 Example Let X have the pmf given below. Find the pmf of Y, where Y = X².
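The slide's pmf did not survive transcription, so this sketch assumes a simple stand-in pmf on {-1, 0, 1} just to show the mechanics of summing p_X(x) over every x that maps to the same y:

```python
from collections import defaultdict
from fractions import Fraction

# Assumed stand-in pmf for X (the slide's actual pmf was not recoverable)
p_X = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}

g = lambda x: x ** 2  # the transformation Y = X^2

# p_Y(y) sums p_X(x) over every x with g(x) = y; here -1 and 1 both map to 1
p_Y = defaultdict(Fraction)
for x, p in p_X.items():
    p_Y[g(x)] += p

print(dict(p_Y))  # {1: 2/3, 0: 1/3}
```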

33 1.7 CONTINUOUS RANDOM VARIABLES We say a random variable is a continuous random variable if its cumulative distribution function F_X(x) is a continuous function for all x in R. We define the cdf as F_X(x) = P(X ≤ x) = ∫ f_X(t) dt, integrated from -∞ to x. The probability density function can be obtained as f_X(x) = dF_X(x)/dx.

34 Note

35 Example Let the random variable X be the time in seconds between incoming telephone calls at a busy switchboard. Suppose that a reasonable pdf for X is given below. Find P(X > 4).
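The slide's pdf was an image that did not survive. Assuming an exponential pdf f(x) = (1/4)e^(-x/4) for x > 0 (mean 4 seconds — an assumption, not confirmed by the transcript), P(X > 4) works out as below:

```python
from math import exp

# Assumed pdf: f(x) = (1/4) * exp(-x/4) for x > 0 (exponential, mean 4).
# This choice is an assumption; the slide's actual pdf was not recoverable.
def survival(x, mean=4.0):
    """P(X > x) for an exponential with the given mean: exp(-x/mean)."""
    return exp(-x / mean)

print(survival(4))  # P(X > 4) = e^{-1} ≈ 0.3679
```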

36 Transformation Let X be a continuous random variable with pdf f_X(x) and support S. Let Y = g(X), where g(x) is a one-to-one differentiable function on the support of X, S. Denote the inverse of g by x = g⁻¹(y) and let dx/dy = d[g⁻¹(y)]/dy. Then the pdf of Y is given by f_Y(y) = f_X(g⁻¹(y)) |dx/dy| for y in the support of Y.

37 Example Let X have the pdf given below. Find the pdf of Y, where Y = -2 log X. – f(x) = 1 for 0 < x < 1
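Working this through with the transformation theorem (assuming log is the natural log): x = g⁻¹(y) = e^(-y/2), so dx/dy = -(1/2)e^(-y/2) and f_Y(y) = 1 · (1/2)e^(-y/2) for y > 0. A Monte Carlo sketch (my own) to sanity-check that density:

```python
import random
from math import exp, log

# Y = -2 log X with X uniform on (0, 1); the derivation above gives
# f_Y(y) = (1/2) e^{-y/2} for y > 0, i.e. cdf F_Y(y) = 1 - e^{-y/2}.
# Using 1.0 - random.random() keeps the argument of log strictly positive.
samples = [-2 * log(1.0 - random.random()) for _ in range(200_000)]

empirical = sum(y <= 2 for y in samples) / len(samples)
theoretical = 1 - exp(-1)  # F_Y(2)
print(f"empirical {empirical:.4f} vs theoretical {theoretical:.4f}")
```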

38 1.8 EXPECTATION OF A RANDOM VARIABLE Let X be a random variable. If X is a continuous random variable with pdf f(x) and ∫ |x| f(x) dx < ∞, then the expectation of X is E(X) = ∫ x f(x) dx. If X is discrete with pmf p(x) and ∑ |x| p(x) < ∞, then E(X) = ∑ x p(x).

39 Example for a discrete RV Let the random variable X of the discrete type have the pmf given by the table below. Find E(X).

x        1     2     3     4
P(X=x)   4/10  1/10  3/10  2/10
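A one-liner check of this expectation (my own sketch):

```python
from fractions import Fraction

# pmf from the table above
pmf = {1: Fraction(4, 10), 2: Fraction(1, 10), 3: Fraction(3, 10), 4: Fraction(2, 10)}

E_X = sum(x * p for x, p in pmf.items())  # E(X) = sum of x * p(x)
print(E_X)  # 23/10 = 2.3
```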

40 Theorem The expectation of a constant is the constant itself, that is, E(k) = k. Let X be a random variable and let Y = g(X) for some function g. – Suppose X is continuous with pdf f(x); then E(Y) = E[g(X)] = ∫ g(x) f(x) dx. – Suppose X is discrete with pmf p(x); then E(Y) = ∑ g(x) p(x).

41 Theorem Let g₁(X) and g₂(X) be functions of a random variable X. Suppose the expectations of g₁(X) and g₂(X) exist. Then for any constants k₁ and k₂, the expectation of k₁g₁(X) + k₂g₂(X) exists and is given by – E[k₁g₁(X) + k₂g₂(X)] = k₁E[g₁(X)] + k₂E[g₂(X)]

42 1.9 SOME SPECIAL EXPECTATIONS Some expectations have special names. – E(X) is called the mean value of X. – It is denoted by µ. – The mean is the first moment (about 0) of a random variable.

43 Variance Let X be a random variable with finite mean µ and such that E[(X - µ)²] is finite. Then the variance of X is defined to be E[(X - µ)²]. It is usually denoted by σ² or by Var(X). – We can write σ² = E(X²) - µ². – Here σ is called the standard deviation of X; it measures the dispersion of the points in the space relative to the mean.
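Continuing the slide-39 pmf, a sketch checking that the definition and the shortcut formula agree:

```python
from fractions import Fraction

pmf = {1: Fraction(4, 10), 2: Fraction(1, 10), 3: Fraction(3, 10), 4: Fraction(2, 10)}
mu = sum(x * p for x, p in pmf.items())  # mean, 23/10

var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())       # E[(X - mu)^2]
var_short = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2  # E(X^2) - mu^2

print(var_def, var_short, var_def == var_short)  # 141/100 both ways
```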

44 Moment Generating Function Let X be a random variable such that for some h > 0, the expectation of e^(tX) exists for -h < t < h. The moment generating function (mgf) of X is defined to be the function M(t) = E(e^(tX)) for -h < t < h. If X and Y have the same distribution, then they have the same mgf in a neighborhood of 0. The converse is also true.
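A sketch of why M(t) "generates moments": M′(0) = E(X) and M″(0) = E(X²), shown symbolically for the fair-die pmf (my own example):

```python
import sympy as sp

t = sp.symbols("t")

# mgf of a fair six-sided die: M(t) = (1/6) * sum of e^{tk}, k = 1..6
M = sp.Rational(1, 6) * sum(sp.exp(t * k) for k in range(1, 7))

mean = sp.diff(M, t).subs(t, 0)       # M'(0)  = E(X)
second = sp.diff(M, t, 2).subs(t, 0)  # M''(0) = E(X^2)

print(mean)              # 7/2
print(second - mean**2)  # variance = 35/12
```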

45 Note

