
1 MATH 256 Probability and Random Processes, Lecture 3: Random Variables. Yrd. Doç. Dr. Didem Kivanc Tureli (didemk@ieee.org, didem.kivanc@okan.edu.tr). OKAN UNIVERSITY, Faculty of Engineering and Architecture. Fall 2011, 14/10/2011.

2 What is a random variable? – A random variable X is not a “variable” in the algebraic sense. – A random variable X is a function: from the set of outcomes of a random event (the sample space S of an experiment) to the set of real numbers ℝ. Realizations of a random variable are called random variates. [Diagram: the set of outcomes of a coin toss, {heads, tails}, mapped by the random variable into ℝ.]

3 Example. Experiment: toss 3 coins. Sample space: S = {(H,H,H), (H,H,T), (H,T,H), (T,H,H), (H,T,T), (T,H,T), (T,T,H), (T,T,T)}. Y is a random variable giving the number of heads that landed: Y(H,H,H) = 3; Y(H,H,T) = Y(H,T,H) = Y(T,H,H) = 2; Y(H,T,T) = Y(T,H,T) = Y(T,T,H) = 1; Y(T,T,T) = 0.
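A short Python sketch of the same idea, treating Y literally as a function on the sample space (the names here are ours, for illustration):

```python
from itertools import product
from collections import Counter

# Sample space: all 8 outcomes of tossing 3 coins.
sample_space = list(product("HT", repeat=3))

# The random variable Y maps each outcome to a real number:
# the number of heads in that outcome.
def Y(outcome):
    return outcome.count("H")

# With all 8 outcomes equally likely, the pmf of Y follows by counting:
counts = Counter(Y(w) for w in sample_space)
pmf = {y: c / len(sample_space) for y, c in counts.items()}
print(pmf)  # {3: 0.125, 2: 0.375, 1: 0.375, 0: 0.125}
```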

4 Example. Three balls are to be randomly selected without replacement from an urn containing 20 balls numbered 1 through 20. If we bet that at least one of the balls drawn has a number as large as or larger than 17, what is the probability that we win the bet? Let X be the largest of the three numbers drawn.
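We win exactly when X ≥ 17, and the complementary event {X ≤ 16} occurs exactly when all three balls come from {1, …, 16}, so:

$$P\{\text{win}\} = P\{X \ge 17\} = 1 - P\{X \le 16\} = 1 - \frac{\binom{16}{3}}{\binom{20}{3}} = 1 - \frac{560}{1140} = \frac{29}{57} \approx 0.508.$$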

5 Independent trials, each consisting of flipping a coin having probability p of coming up heads, are continually performed until either a head occurs or a total of n flips is made. If we let X denote the number of times the coin is flipped, then X is a random variable taking on one of the values 1, 2, 3, …, n with respective probabilities:
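The event {X = k} for k < n requires k − 1 tails followed by a head, while {X = n} occurs as soon as the first n − 1 flips are all tails:

$$P\{X = k\} = (1-p)^{k-1}\,p, \quad k = 1, 2, \ldots, n-1, \qquad P\{X = n\} = (1-p)^{n-1}.$$

As a check, the geometric sum gives $\sum_{k=1}^{n-1}(1-p)^{k-1}p = 1-(1-p)^{n-1}$, so the probabilities sum to 1.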

6 Three balls are randomly chosen from an urn containing 3 white, 3 red, and 5 black balls. Suppose that we win $1 for each white ball selected and lose $1 for each red ball selected. If we let X denote our total winnings from the experiment, then X is a random variable taking on the possible values 0, ±1, ±2, ±3, with probabilities worked out below. Suppose every ball has a number; then the balls are W1, W2, W3, R1, R2, R3, B1, B2, B3, B4, B5, or, for convenience, we can number them from 1 to 11. So there are C(11,3) = 165 ways to choose three balls from this set.

7 The list of possible values for X is {−3, −2, −1, 0, 1, 2, 3}. To get −3, we must choose RRR. To get −2, we must choose two R and one B. To get −1, we must choose two R and one W, or one R and two B. To get 0, we must choose one R, one W and one B, or BBB. To get +1, we must choose two W and one R, or one W and two B. To get +2, we must choose two W and one B. To get +3, we must choose WWW. So:
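Counting each case over the $\binom{11}{3} = 165$ equally likely selections (swapping the roles of white and red negates X, so p(−i) = p(i)):

$$p(0) = \frac{3\cdot 3\cdot 5 + \binom{5}{3}}{165} = \frac{55}{165}, \qquad p(1) = p(-1) = \frac{\binom{3}{2}\cdot 3 + 3\cdot\binom{5}{2}}{165} = \frac{39}{165},$$
$$p(2) = p(-2) = \frac{\binom{3}{2}\cdot 5}{165} = \frac{15}{165}, \qquad p(3) = p(-3) = \frac{\binom{3}{3}}{165} = \frac{1}{165}.$$

As a check, 55 + 2(39 + 15 + 1) = 165, so the probabilities sum to 1.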


9 The cumulative distribution function. For a random variable X, the function F defined by F(x) = P{X ≤ x}, −∞ < x < ∞, is called the cumulative distribution function or, simply, the distribution function of X. Thus, the distribution function specifies, for every real value x, the probability that the random variable is less than or equal to x. F(x) is a nondecreasing function of x; that is, if a < b then F(a) ≤ F(b).

10 For the previous example, F is the step function F(x) = 0 for x < −3; 1/165 for −3 ≤ x < −2; 16/165 for −2 ≤ x < −1; 55/165 for −1 ≤ x < 0; 110/165 for 0 ≤ x < 1; 149/165 for 1 ≤ x < 2; 164/165 for 2 ≤ x < 3; and 1 for x ≥ 3.

11 For the previous example: [Graph: the step function F, constant between the possible values −3, −2, …, 3 and jumping by p(x_i) at each possible value x_i.]

12 Probability Mass Function. The probability mass function p(a) = P{X = a} is defined for a discrete random variable X. Suppose that X can take on only the values x_1, x_2, …; that is, p(x_i) ≥ 0 for i = 1, 2, … and p(x) = 0 for all other values of x. Then, since x must be one of the values x_i, Σ_i p(x_i) = 1.

13 Example of a probability mass function. [Graph: a pmf drawn as vertical bars, one bar of height p(x_i) at each possible value x_i.]

14 Example. The probability mass function of a random variable X is given by p(i) = c λ^i / i!, i = 0, 1, 2, …, where λ is some positive value. Find (a) P{X = 0} and (b) P{X > 2}.
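Since the probabilities must sum to 1, the constant c is forced, and both answers follow:

$$\sum_{i=0}^{\infty} c\,\frac{\lambda^i}{i!} = c\,e^{\lambda} = 1 \quad\Rightarrow\quad c = e^{-\lambda},$$

so (a) $P\{X = 0\} = e^{-\lambda}$, and (b)

$$P\{X > 2\} = 1 - P\{X=0\} - P\{X=1\} - P\{X=2\} = 1 - e^{-\lambda}\Bigl(1 + \lambda + \frac{\lambda^2}{2}\Bigr).$$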

15 The cumulative distribution function. The cumulative distribution function F can be expressed in terms of p(a) by F(a) = Σ_{all x ≤ a} p(x). If X is a discrete random variable whose possible values are x_1, x_2, x_3, …, where x_1 < x_2 < x_3 < …, then the distribution function F of X is a step function.

16 Example. For example, suppose the probability mass function (pmf) of X is as follows; then the distribution function F of X is the step function obtained by accumulating the pmf values.
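A concrete instance, with pmf values assumed for illustration (they match the standard textbook example):

$$p(1) = \frac{1}{4}, \quad p(2) = \frac{1}{2}, \quad p(3) = \frac{1}{8}, \quad p(4) = \frac{1}{8},$$

$$F(a) = \begin{cases} 0, & a < 1 \\ 1/4, & 1 \le a < 2 \\ 3/4, & 2 \le a < 3 \\ 7/8, & 3 \le a < 4 \\ 1, & a \ge 4. \end{cases}$$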

17 Expectation of a random variable. If X is a discrete random variable having a probability mass function p(x), then the expectation, or the expected value, of X, denoted by E[X], is defined by E[X] = Σ_{x: p(x) > 0} x p(x). In other words: take every possible value of X, multiply it by the probability of getting that value, and add up the results.
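As a minimal sketch, the definition translates directly into code (the function name and the pmf dictionary format are ours, for illustration):

```python
def expectation(pmf):
    """E[X] for a discrete random variable, given its pmf
    as a dict mapping each value x to p(x)."""
    return sum(x * p for x, p in pmf.items())

# The coin from the next slide: X = 1 for heads, X = 2 for tails.
coin = {1: 0.5, 2: 0.5}
print(expectation(coin))  # 1.5
```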

18 Examples of expectation. For example, suppose you have a fair coin. You flip the coin and define a random variable X such that – if the coin lands heads, X = 1; – if the coin lands tails, X = 2. Then the probability mass function of X is given by p(1) = 1/2, p(2) = 1/2. Or we can write p(x) = 1/2 for x = 1, 2, so that E[X] = 1·(1/2) + 2·(1/2) = 3/2.

19 Examples of expectation. Next, suppose you throw a fair die. You roll the die and define a random variable Y such that – if the die lands on a number less than or equal to 5, then Y = 0; – if the die lands on a number greater than 5, then Y = 1. Then the probability mass function of Y is given by p(0) = 5/6, p(1) = 1/6, so E[Y] = 0·(5/6) + 1·(1/6) = 1/6.

20 Frequency interpretation of probabilities. The law of large numbers (which we will see in Chapter 8) says that if we have an experiment (e.g. tossing a coin) and we perform it an infinite number of times, then the proportion of time that any event E occurs will be P(E). [Recall here that an event means a subset of the sample space, i.e. a set of outcomes of the experiment.] So, for instance, suppose X is a random variable which will be equal to x_1 with probability p(x_1), x_2 with probability p(x_2), …, x_n with probability p(x_n). By the frequency interpretation, if we keep playing this game, the proportion of time that we win x_i will be p(x_i).

21 Frequency interpretation of probabilities. Or we can say that when we play the game N times, where N is a very big number, we will win x_i about Np(x_i) times. Then the average winnings per game will be (Σ_{i=1}^{n} x_i · Np(x_i)) / N = Σ_{i=1}^{n} x_i p(x_i) = E[X].
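A small simulation sketch of this claim (the game's values and probabilities are assumed for illustration):

```python
import random

# An assumed game for illustration: win 1, 2, or 5 with these probabilities.
values = [1, 2, 5]
probs = [0.5, 0.3, 0.2]

N = 100_000
winnings = random.choices(values, weights=probs, k=N)

average = sum(winnings) / N                           # sample mean over N games
expected = sum(x * p for x, p in zip(values, probs))  # E[X] = 2.1
print(average, expected)  # the average approaches E[X] as N grows
```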

22 Example 3a. Question: find E[X], where X is the outcome when we roll a fair die. Solution: since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6, E[X] = (1 + 2 + 3 + 4 + 5 + 6)·(1/6) = 21/6 = 7/2.

23 Example 3b. Question: we say that I is an indicator variable for an event A if I = 1 when A occurs and I = 0 when A does not occur. What is E[I]? Solution: since p(1) = P(A) and p(0) = 1 − P(A), E[I] = 1·P(A) + 0·(1 − P(A)) = P(A). The expectation of the indicator variable for A is just the probability of A.

24 Example 3d. A school class of 120 students is driven in 3 buses to a symphonic performance. There are 36 students in one of the buses, 40 in another, and 44 in the third. When the buses arrive, one of the 120 students is randomly chosen. Let X denote the number of students on the bus of that randomly chosen student, and find E[X]. Solution: the chosen student is equally likely to be any of the 120, so P{X = 36} = 36/120, P{X = 40} = 40/120, P{X = 44} = 44/120, and E[X] = 36·(36/120) + 40·(40/120) + 44·(44/120) = (1296 + 1600 + 1936)/120 = 4832/120 ≈ 40.27.

25 Example 3d (continued). Same problem as before, but assume that the bus is chosen randomly instead of the student, and find E[X]. Solution: each of the 3 buses is equally likely to be chosen, so E[X] = (36 + 40 + 44)/3 = 40. This is smaller than before: choosing a student at random gives more weight to the fuller buses.

26 Expectation of a function of a random variable. To find E[g(X)], that is, the expectation of g(X), use a two-step process: – find the pmf of g(X); – compute E[g(X)] from that pmf.

27 Let X denote a random variable that takes on any of the values −1, 0, and 1 with respective probabilities P{X = −1} = 0.2, P{X = 0} = 0.5, P{X = 1} = 0.3. Compute E[X²]. Solution: let Y = X². Then the probability mass function of Y is given by P{Y = 1} = P{X = −1} + P{X = 1} = 0.5 and P{Y = 0} = P{X = 0} = 0.5, so E[X²] = E[Y] = 1·(0.5) + 0·(0.5) = 0.5. Note that E[X²] = 0.5 is not equal to (E[X])² = 0.01.

28 Statistics vs. Probability. You may have noticed that the concept of “expectation” seems a lot like the concept of “average”. So why do we use this fancy new word “expectation”? Why not just call it “average”? We find the average of a list of numbers: the numbers are already known. We find the expectation of a random variable: we may have only one such random variable, and we may toss the coin or die only once.

29 Statistics vs. Probability. For instance, let us define a random variable X using the result of a coin toss: let X = 1 if the coin lands heads, X = 0 if the coin lands tails. If we perform this experiment K times, we will get a list of values for X. We can find the average value of X by adding all the values of X and dividing by K: average = (x_1 + x_2 + … + x_K)/K. Is this coin fair? We don’t know, but we can find out.

30 Statistics vs. Probability. What we did on the previous slide was statistics: we analyzed the data to draw some conclusions about the process or mechanism (i.e. the coin) that generated the data. Probability is how we draw conclusions about the future. So suppose I did the experiments on the previous slide yesterday. Today I will come into class and toss the coin exactly once. Then I can use the statistics from yesterday to find out what I can expect the result of today's coin toss to be: E[X] = 1·P{heads} + 0·P{tails}, with P{heads} estimated by yesterday's average, giving E[X] ≈ 0.5.

31 Statistics vs. Probability. Okay, so I got 0.5. What does this mean? X can never equal 0.5. Expectation makes more sense with continuous random variables, e.g. when you measure a voltage on a voltmeter. With the coin toss you can think of it this way: suppose someone wants you to guess X, but you will pay a lot of money if you’re wrong, and the money you pay grows with how wrong you are. If you guess g and the result was actually a, then you have to pay (g − a)². What should you guess? You must minimize E[(g − X)²]. If you guess g = E[X], then this penalty is minimized.
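To see why g = E[X] is the minimizer, expand the expected penalty and set its derivative with respect to g to zero:

$$E[(g - X)^2] = g^2 - 2g\,E[X] + E[X^2], \qquad \frac{d}{dg}\,E[(g - X)^2] = 2g - 2E[X] = 0 \;\Rightarrow\; g = E[X].$$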

32 Statistics: how to find the pmf of a random voltage from measurements. Suppose you are going to measure a voltage. You know that the voltage is really about 5 V, but you have an old voltmeter that doesn’t measure very well. The voltmeter is digital and has 1 decimal place, so you can only read the voltages 0.0, 0.1, …, 4.7, 4.8, 4.9, 5.0, 5.1, …, 9.9. You start measuring the voltage and get the following measurements: 4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, … From these measurements you can construct a probability mass function graph as follows.

33 Pmf drawn from the results of the experiment. Measurements: 4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, 5.0, 4.5, 4.8, 5.1, 5.0, 5.1, 4.9, 5.3, 5.1, 5.2, 5.1, 5.4. [Histogram: one bar per reading on the axis 4.5, 4.6, …, 5.5, of height equal to the number of times that value was measured.]


35 pmf derived mathematically. Based on the frequency interpretation, we can define the pmf as follows: p(v) = (number of times v was measured) / (total number of measurements), e.g. p(5.0) = 4/20 and p(4.5) = 1/20. Now I can predict the future based on this pmf. Probability does not bother with data; statistics is all about data.
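The same construction in code, as a sketch using the twenty measurements from slide 33:

```python
from collections import Counter

measurements = [4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, 5.0, 4.5,
                4.8, 5.1, 5.0, 5.1, 4.9, 5.3, 5.1, 5.2, 5.1, 5.4]

# Empirical pmf: the relative frequency of each reading.
counts = Counter(measurements)
pmf = {v: c / len(measurements) for v, c in sorted(counts.items())}
print(pmf)  # e.g. p(5.0) = 4/20 = 0.2

# The pmf then gives a prediction about the future, e.g. the expected reading:
expected = sum(v * p for v, p in pmf.items())
print(expected)  # ≈ 5.0 (5.015 for these twenty values)
```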

36 Statistics vs. Probability. Are these the correct probabilities? I don’t know. Even if we ran the experiment millions of times, we would still be wrong: probably a little wrong, maybe even very wrong. It is always possible to throw 1000 heads in a row even with a fair coin, although it is very unlikely that this will happen. In any case, when studying probability we are not concerned with whether the pmf is correct for this experiment, because we do not care about experiments or data. The statisticians, or the people who designed this experiment, must take care to design it well, so they can give us a good statistical model. All we know is the statistical model (that is, the pmf), and we derive, mathematically, predictions about the future based on this pmf.

