Appendix: Probability Theory Review

1 Appendix: Probability Theory Review. Each outcome of an experiment is a sample point. The collection of sample points is the sample space, S. Sample points can be aggregated into subsets of the sample space called events. A probability measure P assigns a probability in [0, 1] to each event.

2 If A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B). If an event A occurs n_A times out of n tries, P(A) = lim_{n→∞} n_A / n.
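The relative-frequency view of probability can be checked by simulation. A minimal Python sketch, where the fair die and the event "even face" are illustrative choices, not from the slides:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Event A: a fair die shows an even face; the true probability is 1/2.
n = 100_000
n_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)

# The relative frequency n_A / n approaches P(A) as n grows.
print(abs(n_A / n - 0.5) < 0.01)
```

With 100,000 trials the standard deviation of the relative frequency is about 0.0016, so the estimate lands well within 0.01 of 1/2.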

3 Conditional probability: P(A|B) = P(AB) / P(B). The probability of the intersection of events A and B is P(AB) = P(A|B) P(B) = P(B|A) P(A). Two events are statistically independent if P(AB) = P(A) P(B).
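Conditional probability and independence can be worked out by counting sample points. A small Python sketch; the two-dice sample space and the events A and B are illustrative choices, not from the slides:

```python
from fractions import Fraction
from itertools import product

# Sample space S: all ordered outcomes of rolling two fair dice.
S = list(product(range(1, 7), repeat=2))

A = {s for s in S if s[0] + s[1] == 7}  # event A: the faces sum to 7
B = {s for s in S if s[0] == 3}         # event B: the first die shows 3

def P(event):
    # Equally likely sample points: P(E) = |E| / |S|.
    return Fraction(len(event), len(S))

# Conditional probability: P(A|B) = P(AB) / P(B)
p_a_given_b = P(A & B) / P(B)
print(p_a_given_b)               # 1/6

# A and B are statistically independent: P(AB) = P(A) P(B)
print(P(A & B) == P(A) * P(B))   # True
```

Here conditioning on B does not change the probability of A, which is exactly what independence means.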

4 For independent events A, B, C, …: P(ABC…) = P(A) P(B) P(C) …. Here P(ABC…) is the joint probability and P(A) is an example of a marginal probability.

5 A.2 Densities and Distribution Functions. Let X be a continuous (real-valued) random variable. F(a) = P(X ≤ a) is the cumulative distribution function, with argument a.

6 If a_1 < a_2 then F(a_1) ≤ F(a_2): the cumulative distribution function is nondecreasing as a function of a.

7 A probability density function is related to the cumulative distribution function by f(x) = dF(x)/dx, or equivalently F(a) = ∫_{-∞}^{a} f(x) dx. For a continuous probability density function, ∫_{-∞}^{∞} f(x) dx = 1.
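The density–distribution relationship can be verified numerically. A minimal sketch, using the exponential distribution with rate 2.0 as an illustrative example (not a choice made in the slides):

```python
import math

lam = 2.0                                  # illustrative rate parameter
f = lambda x: lam * math.exp(-lam * x)     # density f(x) = lam * e^{-lam x}, x >= 0
F = lambda a: 1.0 - math.exp(-lam * a)     # closed-form CDF F(a)

def integrate(g, lo, hi, n=100_000):
    # Midpoint-rule numerical integration of g over [lo, hi].
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

a = 1.5
# F(a) should equal the integral of f from 0 (the support's lower end) to a.
print(abs(integrate(f, 0.0, a) - F(a)) < 1e-6)   # True
```

The same check with hi = some large value approximates the normalization ∫ f(x) dx = 1.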

8 For a probability density function on a discrete state space, Σ_x p(x) = 1.

10 A.3 Joint Densities and Distributions. The joint distribution function of random variables X_1 and X_2 is F(x_1, x_2) = P(X_1 ≤ x_1, X_2 ≤ x_2) for arguments x_1 and x_2, and the joint density function is f(x_1, x_2) = ∂²F(x_1, x_2) / (∂x_1 ∂x_2).

11 The marginal density function is related to the joint density function by f_1(x_1) = ∫_{-∞}^{∞} f(x_1, x_2) dx_2. For random variables (say two) over a discrete state space, a summation can be used to compute a marginal density function from the joint density function: p_1(x_1) = Σ_{x_2} p(x_1, x_2).
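In the discrete case, marginalization is just a sum over the other variable. A minimal sketch; the small joint pmf below is a hypothetical example, not from the slides:

```python
from fractions import Fraction

# Hypothetical joint pmf p(x1, x2) over a 2x2 discrete state space.
p = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
}

# Marginal of X1: sum the joint pmf over all values of x2.
p1 = {}
for (x1, x2), prob in p.items():
    p1[x1] = p1.get(x1, Fraction(0)) + prob

print(p1)  # {0: Fraction(1, 2), 1: Fraction(1, 2)}
```

The marginal pmf still sums to 1, since the joint pmf did.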

12 The conditional probability density is related to the joint and marginal probability densities through f(x_1 | x_2) = f(x_1, x_2) / f_2(x_2). The collection of n random variables X_1, X_2, … X_n is said to be independent if the joint probability density function is equal to the product of the marginal density functions: f(x_1, x_2, …, x_n) = f_1(x_1) f_2(x_2) … f_n(x_n).

13 A.4 Expectations. The expectation of a random variable X is E[X] = ∫_{-∞}^{∞} x f(x) dx. Intuitively, the expectation is an integration (averaging) of x weighted by the probability of x occurring, f(x); it is simply the mean or average value of the random variable X. For the discrete case: E[X] = Σ_x x p(x).
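The discrete expectation is a probability-weighted sum. A minimal sketch using a fair six-sided die as the illustrative random variable:

```python
from fractions import Fraction

# pmf of a fair six-sided die: p(x) = 1/6 for x in 1..6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum over x of x * p(x)
mean = sum(x * p for x, p in pmf.items())
print(mean)   # 7/2
```

The familiar answer 3.5 is the average face value, even though no single roll can produce it.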

14 If there are n independent random variables X_1, X_2, … X_n, then E[X_1 X_2 … X_n] = E[X_1] E[X_2] … E[X_n].

15 The nth moment of a probability density is E[X^n] = ∫_{-∞}^{∞} x^n f(x) dx. For the discrete case, E[X^n] = Σ_x x^n p(x).

16 The variance of the random variable X is σ² = Var[X] = E[(X − E[X])²] = ∫_{-∞}^{∞} (x − E[X])² f(x) dx.

17 For the discrete case, Var[X] = Σ_x (x − E[X])² p(x). The variance is related to the 2nd moment through Var[X] = E[X²] − (E[X])².
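Both routes to the variance, the definition and the second-moment identity, give the same answer. A minimal sketch, again using a fair die as the illustrative random variable:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die

mean = sum(x * p for x, p in pmf.items())              # E[X] = 7/2
second_moment = sum(x**2 * p for x, p in pmf.items())  # E[X^2] = 91/6

# Variance two ways: by definition, and via Var[X] = E[X^2] - (E[X])^2.
var_def = sum((x - mean)**2 * p for x, p in pmf.items())
var_id = second_moment - mean**2

print(var_def, var_id)  # 35/12 35/12
```

The identity form is usually more convenient in practice, since E[X] and E[X²] can be accumulated in a single pass.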

18 A.5 Convolution. Suppose Y = X_1 + X_2, where the random variables X_1 and X_2 are independent with marginal densities f_1(x_1) and f_2(x_2). Then the density of Y, f_Y(y), is given by f_Y(y) = ∫_{-∞}^{∞} f_1(x) f_2(y − x) dx. This function is called the convolution of f_1 and f_2.
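The discrete analogue replaces the integral with a sum: f_Y(y) = Σ_x f_1(x) f_2(y − x). A minimal sketch computing the distribution of the sum of two independent fair dice (an illustrative choice, not from the slides):

```python
from fractions import Fraction

# Marginal pmfs of two independent fair dice (discrete analogues of f1, f2).
f1 = {x: Fraction(1, 6) for x in range(1, 7)}
f2 = {x: Fraction(1, 6) for x in range(1, 7)}

# Convolution: fY(y) = sum over x of f1(x) * f2(y - x).
fY = {}
for x1, p1 in f1.items():
    for x2, p2 in f2.items():
        fY[x1 + x2] = fY.get(x1 + x2, Fraction(0)) + p1 * p2

print(fY[7])             # 1/6 -- seven is the most likely sum
print(sum(fY.values()))  # 1  -- the result is a valid pmf
```

Convolving repeatedly gives the density of a sum of any number of independent random variables, which is the main use of this result in queueing analysis.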

19 A.6 Combinations. The number of combinations of n elements taken m at a time is: C(n, m) = n! / (m! (n − m)!).
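The factorial formula can be written directly and checked against the standard library. A minimal sketch (the helper name `combinations` is ours, not from the slides):

```python
import math

def combinations(n, m):
    # C(n, m) = n! / (m! * (n - m)!)
    return math.factorial(n) // (math.factorial(m) * math.factorial(n - m))

print(combinations(5, 2))                     # 10
print(combinations(5, 2) == math.comb(5, 2))  # True: agrees with the stdlib
```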

20 A.7 Some Useful Summations. The following summations and their closed-form expressions are often useful: Σ_{i=0}^{n} x^i = (1 − x^{n+1}) / (1 − x); and, for |x| < 1, Σ_{i=0}^{∞} x^i = 1 / (1 − x) and Σ_{i=0}^{∞} i x^i = x / (1 − x)².

22 A.8 Useful Moment-Generating Function Identities. The moment-generating function of a random variable X is M(t) = E[e^{tX}]; its nth derivative evaluated at t = 0 yields the nth moment, M^{(n)}(0) = E[X^n].
