CS433 Modeling and Simulation, Lecture 03 – Part 01: Probability Review. Dr. Anis Koubâa, Al-Imam Mohammad Ibn Saud University


1 CS433 Modeling and Simulation, Lecture 03 – Part 01: Probability Review. Dr. Anis Koubâa, Al-Imam Mohammad Ibn Saud University. http://10.2.230.10:4040/akoubaa/cs433/ (27 Oct 2008)

2 Goals for Today  Review the fundamental concepts of probability.  Understand the difference between discrete and continuous random variables.

3 Topics  Sample Space  Probability Measure  Joint Probability  Conditional Probability  Bayes’ Rule  Law of Total Probability  Random Variable  Probability Distribution  Probability Mass Function (PMF)  Continuous Random Variable  Probability Density Function (PDF)  Cumulative Distribution Function (CDF)  Mean and Variance

4 Probability Review

5 Sample space: set of all possible outcomes  In probability theory, the sample space (also called the universal sample space or event space) of an experiment or random trial, often denoted S, Ω or U (for "universe"), is the set of all possible outcomes.  For example, if the experiment is tossing a coin, the sample space is the set {head, tail}. For tossing a single six-sided die, the sample space is {1, 2, 3, 4, 5, 6}.  In probability theory, an event is a set of outcomes (a subset of the sample space) to which a probability is assigned. Typically, when the sample space is finite, any subset of the sample space is an event (i.e., all elements of the power set of the sample space are defined as events).
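A minimal sketch (not part of the original slides) that represents the die-tossing sample space and two events as plain Python sets; the event names and the prob helper are illustrative choices.

```python
# Sample space for tossing a single six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}

# Events are subsets of the sample space.
even = {2, 4, 6}            # "the outcome is even"
at_least_five = {5, 6}      # "the outcome is at least five"

# With equally likely outcomes, P(E) = |E| / |S|.
def prob(event):
    return len(event) / len(sample_space)

print(prob(even))            # 0.5
print(prob(at_least_five))   # 0.333...
```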

6 Probability measure  Every event (a set of outcomes) is assigned a probability by a function that we call a probability measure.  The probability of every event is between 0 and 1, inclusive.  The probability of the whole set of outcomes is 1.  If A and B are two events with no common outcomes, then the probability of their union is the sum of their probabilities.  Example: a single card is drawn from a deck of 52 cards. Possible events: drawing a red card (P = 1/2); drawing a Jack (P = 1/13).
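A minimal Python sketch (not from the slides) that checks these statements on the card example; the deck representation and the prob helper are illustrative assumptions of this sketch.

```python
from itertools import product

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['hearts', 'diamonds', 'clubs', 'spades']   # hearts and diamonds are red
deck = set(product(ranks, suits))                   # 52 equally likely outcomes

def prob(event):
    return len(event) / len(deck)

red = {card for card in deck if card[1] in ('hearts', 'diamonds')}
jack = {card for card in deck if card[0] == 'J'}

print(prob(deck))   # 1.0: the whole sample space has probability 1
print(prob(red))    # 0.5       -> P = 1/2
print(prob(jack))   # 0.0769... -> P = 1/13

# Additivity: red jacks and black jacks have no common outcome, so the
# probability of their union equals the sum of their probabilities.
red_jacks, black_jacks = red & jack, jack - red
print(prob(red_jacks | black_jacks))          # 0.0769...
print(prob(red_jacks) + prob(black_jacks))    # same value
```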

7 Joint Probability  The probability that event A will not happen (i.e., that the complement event ~A happens) is 1 - p(A).  Event union (∪ = OR); event intersection (∩ = AND).  Joint probability p(A ∩ B): the probability of two events in conjunction, that is, the probability that both events occur together.  Independent events: two events A and B are independent if p(A ∩ B) = p(A) * p(B).  Example: the probability of drawing a card that is red and is a Jack is p = p(red) * p(jack) = 1/2 * 1/13 = 1/26.
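The complement rule and the independence test can be checked numerically. A minimal sketch, again assuming an illustrative 52-card deck (the names deck, red, jack and prob are this sketch's own, not the slides').

```python
from itertools import product

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['hearts', 'diamonds', 'clubs', 'spades']
deck = set(product(ranks, suits))

def prob(event):
    return len(event) / len(deck)

red = {card for card in deck if card[1] in ('hearts', 'diamonds')}
jack = {card for card in deck if card[0] == 'J'}

# Complement rule: P(not red) = 1 - P(red).
print(prob(deck - red), 1 - prob(red))   # 0.5 0.5

# Joint probability of the intersection (red AND jack), and the
# independence test P(A and B) == P(A) * P(B).
print(prob(red & jack))                  # 0.0384... = 1/26
print(prob(red) * prob(jack))            # same value: the events are independent
```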

8 Conditional Probability  The conditional probability p(A|B) is the probability of some event A, given the occurrence of some other event B. It is defined as p(A|B) = p(A ∩ B) / p(B).  If A and B are independent, then p(A|B) = p(A) and p(B|A) = p(B).  That is, if A and B are independent, the conditional probability of A given B is simply the individual probability of A alone; the same holds for B given A.  p(A) is the prior probability; p(A|B) is called the posterior probability.  Once you know B is true, the universe you care about shrinks to B.
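A short sketch of the definition p(A|B) = p(A ∩ B) / p(B) on the card example; the cond_prob helper is an illustrative name introduced here, not from the slides.

```python
from itertools import product

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['hearts', 'diamonds', 'clubs', 'spades']
deck = set(product(ranks, suits))

def prob(event):
    return len(event) / len(deck)

def cond_prob(a, b):
    """P(A | B) = P(A and B) / P(B): the universe shrinks to B."""
    return prob(a & b) / prob(b)

red = {card for card in deck if card[1] in ('hearts', 'diamonds')}
jack = {card for card in deck if card[0] == 'J'}

print(cond_prob(jack, red))   # 0.0769... = 1/13
print(prob(jack))             # same value, because "jack" and "red" are independent
```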

9 Bayes' rule  We know that p(A ∩ B) = p(B ∩ A).  Using the definition of conditional probability, we have p(A ∩ B) = p(A|B) * p(B) and p(B ∩ A) = p(B|A) * p(A).  The Bayes rule is therefore: p(A|B) = p(B|A) * p(A) / p(B).
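A minimal numeric check of Bayes' rule on the card example, assuming a standard 52-card deck; the variable names below are illustrative.

```python
# Quantities counted directly from a standard 52-card deck.
p_red = 26 / 52            # P(red)
p_jack = 4 / 52            # P(jack)
p_red_given_jack = 2 / 4   # P(red | jack): 2 of the 4 jacks are red

# Bayes' rule: P(jack | red) = P(red | jack) * P(jack) / P(red).
p_jack_given_red = p_red_given_jack * p_jack / p_red
print(p_jack_given_red)    # 0.0769... = 2/26, i.e. 2 red jacks among the 26 red cards
```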

10 Law of Total Probability  In probability theory, the law of total probability states that the prior probability of A, p(A), is equal to the expected value of the posterior probability of A. That is, for any random variable N, p(A) = E[p(A|N)], where p(A|N) is the conditional probability of A given N.

11 Law of Total Probability  Law of alternatives: the term law of total probability is sometimes taken to mean the law of alternatives, which is a special case of the law of total probability applying to discrete random variables.  If { B_n : n = 1, 2, 3, ... } is a finite partition of a probability space and each set B_n is measurable, then for any event A we have p(A) = Σ_n p(A ∩ B_n) or, alternatively, p(A) = Σ_n p(A|B_n) * p(B_n).
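A sketch of the law of alternatives using the four suits of a standard deck as the partition {B_n}; the dictionaries below are this sketch's illustrative choices, not from the slides.

```python
# Partition of the deck by suit: the four suits are disjoint and cover every card.
p_suit = {'hearts': 13/52, 'diamonds': 13/52, 'clubs': 13/52, 'spades': 13/52}

# Conditional probability of drawing a jack given each suit (1 jack per 13 cards).
p_jack_given_suit = {suit: 1/13 for suit in p_suit}

# Law of total probability: p(A) = sum over the partition of p(A|B_n) * p(B_n).
p_jack = sum(p_jack_given_suit[suit] * p_suit[suit] for suit in p_suit)
print(p_jack)   # 0.0769... = 4/52, matching the direct count
```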

12 Five Minutes Break You are free to discuss the previous slides with your classmates, to take a short break, or to ask questions. Administrative issues: group formation; choose a “class coordinator”.

13 A random variable  A random variable is also known as a stochastic variable.  A random variable is not a variable; it is a function. It maps from the sample space to the real numbers.  A random variable is defined as a quantity whose values are random and to which a probability distribution is assigned.  More formally, a random variable is a measurable function from a sample space to the measurable space of possible values of the variable.  Example: a coin is tossed ten times. The random variable X is the number of tails that are noted. X can only take the values 0, 1, ..., 10, so X is a discrete random variable.
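The coin example can be written directly as code, which makes the "a random variable is a function" point concrete. A minimal sketch using Python's random module; toss_ten_coins and X are names introduced here for illustration.

```python
import random

def toss_ten_coins():
    """One random trial: toss a fair coin ten times; an outcome is a tuple of 'H'/'T'."""
    return tuple(random.choice('HT') for _ in range(10))

def X(outcome):
    """The random variable: a function mapping an outcome to a real number
    (here, the number of tails in the outcome)."""
    return outcome.count('T')

outcome = toss_ten_coins()
print(outcome, '->', X(outcome))   # e.g. ('H', 'T', 'T', ...) -> 4
```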

14 Probability Distribution  The probability distribution of a discrete random variable is a list of probabilities associated with each of its possible values.  It is also sometimes called the probability function or the probability mass function (PMF) of a discrete random variable.

15 Probability Mass Function (PMF)  Formally, the probability distribution or probability mass function (PMF) of a discrete random variable X is a function that gives the probability p(xi) that the random variable equals xi, for each value xi: p(xi) = P(X = xi).  It satisfies the following conditions: p(xi) ≥ 0 for every i, and Σ_i p(xi) = 1.
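For the ten-coin-toss example above, the PMF of the number of tails is binomial with n = 10 and p = 1/2; that binomial form is an assumption carried over from the earlier example, not stated on this slide. A minimal sketch that builds the PMF and checks the two conditions:

```python
from math import comb

n, p = 10, 0.5   # ten tosses of a fair coin, X = number of tails

# PMF of X: p(x) = C(n, x) * p**x * (1 - p)**(n - x) for x = 0, 1, ..., 10.
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

# The two conditions a PMF must satisfy:
assert all(v >= 0 for v in pmf.values())        # every p(x_i) is non-negative
assert abs(sum(pmf.values()) - 1.0) < 1e-12     # the probabilities sum to 1

print(pmf[5])   # 0.2460...: five tails is the most likely value
```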

16 Continuous Random Variable  A continuous random variable is one which can take an uncountably infinite number of possible values, typically any value in an interval.  Continuous random variables are usually measurements.  Examples include height, weight, the amount of sugar in an orange, and the time required to run a mile.

17 Distribution function aggregates  For continuous random variables, we do not ask for the probability of a single exact value (say, 1/6), because that probability is always 0.  Rather, we ask for the probability that the value lies in an interval (a, b).  So for continuous variables, we care about the derivative of the distribution function at a point (that is, the derivative of an integral). This is called the probability density function (PDF).  The probability that a random variable has a value in a set A is the integral of the PDF over that set A.

18 Probability Density Function (PDF)  The probability density function (PDF) of a continuous random variable is a function that can be integrated to obtain the probability that the random variable takes a value in a given interval.  More formally, the probability density function f(x) of a continuous random variable X is the derivative of the cumulative distribution function F(x): f(x) = dF(x)/dx.  Since F(x) = P(X ≤ x), it follows that P(a ≤ X ≤ b) = F(b) - F(a) = ∫_a^b f(x) dx.
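A numerical sketch of the relation between PDF and CDF, assuming an exponential distribution with rate 2 purely for illustration (the slide does not name a particular distribution).

```python
from math import exp

lam = 2.0   # rate parameter, chosen only for illustration

def f(x):
    """PDF of an exponential distribution with rate lam."""
    return lam * exp(-lam * x)

def F(x):
    """CDF of the same distribution: the antiderivative of f on [0, infinity)."""
    return 1 - exp(-lam * x)

# P(a <= X <= b) computed two ways: via the CDF, and by integrating the PDF numerically.
a, b, n = 0.5, 1.5, 100_000
dx = (b - a) / n
integral = sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx   # midpoint rule

print(F(b) - F(a))   # 0.3180...
print(integral)      # agrees to several decimal places
```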

19 Cumulative Distribution Function (CDF)  The cumulative distribution function (CDF) is a function giving the probability that the random variable X is less than or equal to x, for every value x.  Formally, the cumulative distribution function F(x) is defined to be: F(x) = P(X ≤ x), for -∞ < x < ∞.

20 Cumulative Distribution Function (CDF)  For a discrete random variable, the cumulative distribution function is found by summing up the probabilities, as in the example below: F(x) = Σ_{xi ≤ x} p(xi).  For a continuous random variable, the cumulative distribution function is the integral of its probability density function f(x): F(x) = ∫_{-∞}^{x} f(t) dt.

21 Two Minutes Break You are free to discuss the previous slides with your classmates, to take a short break, or to ask questions. Administrative issues: group formation; choose a “class coordinator”.

22 Cumulative Distribution Function (CDF)  Example, discrete case: suppose a discrete random variable X has a given probability mass function p(xi).  The cumulative distribution function F(x) is then found by summing the probabilities p(xi) over all xi ≤ x, as in the sketch below.
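A minimal sketch of the cumulative-sum construction. The PMF values below are hypothetical placeholders chosen for illustration; they are not the values from the original slide's table.

```python
# Hypothetical PMF, used only to show the construction.
pmf = {1: 0.25, 2: 0.5, 3: 0.25}

# The CDF of a discrete random variable is the running sum of the PMF:
# F(x) = sum of p(x_i) over all x_i <= x.
cdf = {}
running_total = 0.0
for x in sorted(pmf):
    running_total += pmf[x]
    cdf[x] = running_total

print(cdf)   # {1: 0.25, 2: 0.75, 3: 1.0}
```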

23 Discrete Distribution Function

24 Mean and variance  The expected value (or population mean) of a random variable indicates its average or central value.  The expected value gives a general impression of the behavior of a random variable without giving full details of its probability distribution (if it is discrete) or its probability density function (if it is continuous).  Expectation of a discrete random variable X: E(X) = Σ_i xi * p(xi).  Expectation of a continuous random variable X: E(X) = ∫ x * f(x) dx.

25 Mean and variance  Example: when a die is thrown, each of the possible faces 1, 2, 3, 4, 5, 6 (the xi's) has a probability of 1/6 (the p(xi)'s) of showing. The expected value of the face showing is therefore:  µ = E(X) = (1 x 1/6) + (2 x 1/6) + (3 x 1/6) + (4 x 1/6) + (5 x 1/6) + (6 x 1/6) = 3.5  Notice that, in this case, E(X) is 3.5, which is not a possible value of X.
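The same computation as a short sketch; the list of faces and the uniform probability 1/6 follow the example above.

```python
# E(X) = sum of x_i * p(x_i) for the six equally likely faces of a fair die.
faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6
mean = sum(x * p for x in faces)
print(mean)   # 3.5 (within floating-point rounding), not itself a possible value of X
```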

26 Mean and variance  The variance is a measure of the 'spread' of a distribution about its average value. Variance is symbolized by V(X), Var(X) or σ².  Whereas the mean is a way to describe the location of a distribution, the variance is a way to capture its scale or degree of being spread out. The unit of variance is the square of the unit of the original variable.  The variance of the random variable X is defined as: Var(X) = E[(X - E(X))²] = E(X²) - (E(X))², where E(X) is the expected value of the random variable X.  The standard deviation is defined as the square root of the variance, i.e.: σ = √(Var(X)).
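A minimal sketch computing the variance and standard deviation of the die example from the previous slide, directly from the definition above; the printed values are approximate because of floating-point rounding.

```python
from math import sqrt

faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6

mean = sum(x * p for x in faces)                    # E(X) = 3.5
variance = sum((x - mean) ** 2 * p for x in faces)  # E[(X - E(X))^2]
std_dev = sqrt(variance)

print(variance)   # approximately 2.9167 (= 35/12)
print(std_dev)    # approximately 1.7078
```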

