Random Variables ECE460 Spring, 2012.

1 Random Variables ECE460 Spring, 2012

2 Combinatorics Notation: n = population size, r = subpopulation (sample) size. How many ordered samples of size r can be formed from a population of size n? Sampling with replacement and with ordering: n^r. Sampling without replacement and with ordering: n!/(n-r)!.

3 How many samples of size r can be formed from a population of size n?
Sampling without replacement and without ordering: C(n, r) = n!/(r!(n-r)!). Sampling with replacement and without ordering: C(n+r-1, r).
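The four counting formulas above can be checked directly with Python's math module (n = 5 and r = 3 are arbitrary illustration values, not from the slides):

```python
from math import comb, perm

n, r = 5, 3  # illustration values

with_repl_ordered = n ** r                 # n^r
without_repl_ordered = perm(n, r)          # n!/(n-r)!
without_repl_unordered = comb(n, r)        # C(n, r)
with_repl_unordered = comb(n + r - 1, r)   # C(n+r-1, r)

print(with_repl_ordered, without_repl_ordered,
      without_repl_unordered, with_repl_unordered)
```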

4 Bernoulli Trials Independent trials, each of which results in a success with probability p and a failure with probability 1-p. The probability of exactly k successes in n trials is C(n, k) p^k (1-p)^(n-k).
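The binomial probability for repeated Bernoulli trials can be sketched as a one-line helper (the function name and the example numbers are mine, not from the slides):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent Bernoulli(p) trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# e.g. 2 successes in 4 fair trials: C(4,2) * 0.5^4 = 0.375
print(binomial_pmf(2, 4, 0.5))
```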

5 Conditional Probabilities
Given two events, E1 & E2, defined on the same probability space with corresponding probabilities P(E1) & P(E2), the conditional probability of E1 given E2 is P(E1 | E2) = P(E1 ∩ E2) / P(E2), provided P(E2) > 0. Bayes' rule follows: P(E2 | E1) = P(E1 | E2) P(E2) / P(E1).

6 Example 4.5 An information source produces 0 and 1 with probabilities 0.3 and 0.7, respectively. The output of the source is transmitted via a channel that has a probability of error (turning a 1 into a 0 or a 0 into a 1) of 0.2. What is the probability that at the output a 1 is observed? What is the probability that a 1 was the output of the source if at the output of the channel a 1 is observed?
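Example 4.5 can be worked numerically with total probability and Bayes' rule:

```python
p1, p0 = 0.7, 0.3   # source probabilities for 1 and 0
pe = 0.2            # channel error probability

# Total probability: a 1 is observed if a 1 survives or a 0 is flipped.
p_obs1 = p1 * (1 - pe) + p0 * pe            # 0.56 + 0.06 = 0.62

# Bayes' rule: probability the source sent a 1 given a 1 was observed.
p_src1_given_obs1 = p1 * (1 - pe) / p_obs1  # 0.56 / 0.62 ≈ 0.903
print(p_obs1, p_src1_given_obs1)
```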

7 Random Variables Working with Sets has its limitations
Selecting events in Ω, assigning P[·], and verifying the three axioms. We would like to leave set theory and move into more advanced and widely used mathematics (i.e., integration, derivatives, limits, …). Random variables map sets in Ω to R. Subsets of the real line of the form (-∞, x] are called Borel sets, and the collection of Borel sets is called the Borel σ-field. A function that associates events in Ω with the Borel sets is called a random variable.

8 Cumulative Distribution Function (CDF)
The CDF of a random variable X is defined as F_X(x) = P(X ≤ x), or equivalently F_X(x) = P({ω : X(ω) ≤ x}). Properties: 0 ≤ F_X(x) ≤ 1; F_X is non-decreasing; F_X(-∞) = 0 and F_X(+∞) = 1; F_X is continuous from the right; P(a < X ≤ b) = F_X(b) - F_X(a).

9 Probability Density Function (PDF)
The PDF of a random variable X is defined as the derivative of its CDF, f_X(x) = dF_X(x)/dx. Properties: f_X(x) ≥ 0; the integral of f_X over the real line equals 1; P(a < X ≤ b) = ∫ from a to b of f_X(x) dx. For discrete random variables, the analogous function P(X = x_i) is known as the Probability Mass Function (PMF).

10 Example 4.6 A coin is flipped three times and the random variable X denotes the total number of heads that show up. The probability of a head in one flip of this coin is denoted by p. What values can the random variable X take? What is the PMF of the random variable X? Derive and plot the CDF of X. What is the probability that X exceeds 1?
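Example 4.6 can be worked numerically. The slide leaves p symbolic, so the value p = 0.6 below is an arbitrary illustration choice:

```python
from math import comb

p = 0.6  # illustration value; the slide keeps p symbolic

# PMF: P(X = k) = C(3, k) p^k (1-p)^(3-k) for k = 0, 1, 2, 3
pmf = {k: comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)}

# CDF at each integer, and P(X > 1) = 1 - F(1)
cdf = {k: sum(pmf[j] for j in range(k + 1)) for k in range(4)}
p_exceeds_1 = 1 - cdf[1]
print(pmf, p_exceeds_1)
```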

11 Uniform Random Variable
This is a continuous random variable taking values between a and b, with equal probability over intervals of equal length: f_X(x) = 1/(b-a) for a ≤ x ≤ b, and 0 otherwise.

12 Bernoulli Random Variable
Only two outcomes (e.g., heads or tails). Probability mass function: P(X = 1) = p, P(X = 0) = 1 - p. Repeated independent Bernoulli trials lead to the binomial law, with the C(n, k) count arising from sampling without replacement and without ordering. Example: 10 independent binary pulses per second arrive at a receiver. The error probability (that is, a zero received as a one or vice versa) is p. What is the probability of at least one error per second? Answer: 1 - (1 - p)^10.

13 Binomial Law Example Five missiles are fired against an aircraft carrier in the ocean. It takes at least two direct hits to sink the carrier. All five missiles are on the correct trajectory but must get through the "point defense" guns of the carrier. It is known that the point defense guns can destroy a missile with probability P = 0.9. What is the probability that the carrier will still be afloat after the encounter?
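A numerical check of the missile example: each missile gets through the point defense with probability 0.1, and the carrier stays afloat with zero or one direct hits (the helper name is mine):

```python
from math import comb

p_through = 1 - 0.9   # each missile survives the point defense w.p. 0.1

def hits_pmf(k, n=5, p=p_through):
    """P(exactly k of the n missiles score a direct hit)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The carrier survives with 0 or 1 direct hits.
p_afloat = hits_pmf(0) + hits_pmf(1)
print(p_afloat)   # 0.9^5 + 5 * 0.1 * 0.9^4 = 0.91854
```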

14 Uniform Distribution Probability density function (pdf): f(x) = 1/(b-a) for a ≤ x ≤ b, 0 otherwise. Cumulative distribution function: F(x) = 0 for x < a, (x-a)/(b-a) for a ≤ x ≤ b, and 1 for x > b. Example: a resistor r is a random variable with a uniform distribution between 900 and 1100 ohms. Find the probability that r is between 950 and 1050 ohms.
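The resistor example worked numerically (the helper name is mine): the answer is the difference of the uniform CDF at the two endpoints.

```python
a, b = 900.0, 1100.0   # uniform support in ohms

def uniform_cdf(x, a=a, b=b):
    """CDF of a uniform random variable on [a, b]."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

p = uniform_cdf(1050) - uniform_cdf(950)
print(p)  # 100 / 200 = 0.5
```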

15 Gaussian (Normal) Random Variable
A continuous random variable described by the density function f_X(x) = (1/√(2πσ²)) exp(-(x-m)²/(2σ²)), denoted N(m, σ²). Properties: the density is symmetric about m; m is the mean and σ² the variance; the distribution is completely specified by m and σ².

16 Special Case: CDF for N(0,1)
The CDF for the normalized Gaussian random variable with m = 0 and σ = 1 is Φ(x) = (1/√(2π)) ∫ from -∞ to x of e^(-t²/2) dt. For a pre-normalized Gaussian N(m, σ²): F_X(x) = Φ((x - m)/σ). Another related function (closely tied to the complementary error function), used for finding P(X > x), is the Q-function: Q(x) = 1 - Φ(x). Properties: Q(0) = 1/2; Q(-x) = 1 - Q(x); Q(x) = (1/2) erfc(x/√2).

17 Gaussian Example A random variable is N(1000; 2500). Find the probability that x is between 900 and 1050.
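A numerical check of the Gaussian example, building the standard-normal CDF from math.erf: with N(1000; 2500), σ = 50, so the answer is Φ(1) - Φ(-2).

```python
from math import erf, sqrt

def phi(x):
    """CDF of the standard normal N(0, 1), via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

m, var = 1000.0, 2500.0
sigma = sqrt(var)   # 50

# Normalize the endpoints and difference the CDF values.
p = phi((1050 - m) / sigma) - phi((900 - m) / sigma)
print(p)   # Phi(1) - Phi(-2) ≈ 0.8186
```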

18 Complementary Error Function

19 The Central Limit Theorem
Let X1, X2, …, Xn be a set of random variables with the following properties: the Xk, k = 1, 2, …, n, are statistically independent; the Xk all have the same probability density function; and both the mean m and the variance σ² exist for each Xk. Let Y be a new random variable defined as Y = X1 + X2 + … + Xn. Then, according to the central limit theorem, the normalized random variable Z = (Y - nm)/(σ√n) approaches a Gaussian random variable with zero mean and unit variance as the number of random variables n increases without limit.
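A quick simulation of the theorem, using Uniform(0, 1) terms as an arbitrary choice of common density (mean 1/2, variance 1/12): the normalized sums should have sample mean near 0 and sample variance near 1.

```python
import random
import statistics

random.seed(0)
n = 2000          # number of i.i.d. terms per sum
trials = 500      # number of normalized sums to generate

m, var = 0.5, 1 / 12   # mean and variance of Uniform(0, 1)

z = []
for _ in range(trials):
    y = sum(random.random() for _ in range(n))
    z.append((y - n * m) / (var * n) ** 0.5)   # normalized sum

# Sample mean and variance of z should be close to 0 and 1.
print(statistics.mean(z), statistics.pvariance(z))
```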

20 Functions of Random Variables
Let X be a r.v. with known CDF and PDF. If Y = g(X) is a function of the r.v. X, then F_Y(y) = P(g(X) ≤ y); when g is monotonically increasing, f_Y(y) = f_X(g⁻¹(y)) · d g⁻¹(y)/dy. Example: given a function g(X), where a > 0 and b are constants and X is a r.v. with a known CDF, find the CDF and PDF of Y = g(X).
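A minimal sketch of the general relation F_Y(y) = F_X(g⁻¹(y)), assuming for illustration the linear case g(X) = aX + b with X ~ Uniform(0, 1) (the slide leaves both g and the distribution of X unspecified):

```python
import random
random.seed(2)

# Assumed linear case g(X) = aX + b, X ~ Uniform(0, 1);
# then F_Y(y) = F_X((y - b)/a), i.e. Y ~ Uniform(b, a + b).
a, b = 3.0, 1.0

samples = [a * random.random() + b for _ in range(100_000)]

# Empirical check of F_Y at y = 2.5: F_X((2.5 - 1)/3) = 0.5
y = 2.5
emp_cdf = sum(s <= y for s in samples) / len(samples)
print(emp_cdf)
```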

21 Statistical Averages The expected value of the random variable X is defined as E[X] = ∫ x f_X(x) dx. Special cases: E[g(X)] = ∫ g(x) f_X(x) dx; the nth moment E[X^n]; the variance Var(X) = E[X²] - (E[X])². Characteristic function: Ψ_X(ω) = E[e^(jωX)] = ∫ f_X(x) e^(jωx) dx.

22 Multiple Random Variables
The joint CDF of X and Y is F_XY(x, y) = P(X ≤ x, Y ≤ y), and its joint PDF is f_XY(x, y) = ∂²F_XY(x, y)/∂x∂y. Properties: f_XY(x, y) ≥ 0 and integrates to 1 over the plane; the marginals are f_X(x) = ∫ f_XY(x, y) dy and f_Y(y) = ∫ f_XY(x, y) dx.

23 Expected Values Given g(X,Y) as a function of X and Y, the expected value is E[g(X, Y)] = ∫∫ g(x, y) f_XY(x, y) dx dy. Special cases: the correlation of X & Y, R_XY = E[XY]; the covariance of X & Y, Cov(X, Y) = E[XY] - E[X]E[Y]. Conditional PDF: f_(Y|X)(y|x) = f_XY(x, y)/f_X(x). X and Y are statistically independent if f_XY(x, y) = f_X(x) f_Y(y).
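Sample versions of these expectations can be sketched as follows (the helper name and the toy data, with Y = 2X, are mine):

```python
def expectations(samples):
    """Sample estimates of E[X], E[Y], the correlation E[XY],
    and the covariance E[XY] - E[X]E[Y]."""
    n = len(samples)
    ex = sum(x for x, _ in samples) / n
    ey = sum(y for _, y in samples) / n
    exy = sum(x * y for x, y in samples) / n   # correlation E[XY]
    cov = exy - ex * ey                        # covariance
    return ex, ey, exy, cov

data = [(1, 2), (2, 4), (3, 6)]   # toy data with Y = 2X
print(expectations(data))
```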

24 Example Two random variables X and Y are distributed according to
Find the value of the constant K. Find the marginal density functions of X and Y. Are X and Y independent?

25 Example Find

26 Jointly Gaussian R.V.'s Definition: X and Y are jointly Gaussian if f_XY(x, y) = (1/(2πσ_X σ_Y √(1-ρ²))) exp{ -[(x-m_X)²/σ_X² - 2ρ(x-m_X)(y-m_Y)/(σ_X σ_Y) + (y-m_Y)²/σ_Y²] / (2(1-ρ²)) }, where ρ is the correlation coefficient between X and Y. If ρ = 0, then f_XY(x, y) = f_X(x) f_Y(y), so X and Y are independent. If X and Y are jointly Gaussian, then the following are also Gaussian: the marginals, the conditional densities, and any linear combination aX + bY.

27 n - Jointly Gaussian R.V.’s
Definition: X = (X1, …, Xn) is jointly Gaussian if f_X(x) = (1/((2π)^(n/2) |C|^(1/2))) exp(-(1/2)(x - m)ᵀ C⁻¹ (x - m)), where m = E[X] is the mean vector and C = E[(X - m)(X - m)ᵀ] is the covariance matrix.

28 Jointly Gaussian Properties
Any subset of X1, …, Xn is a vector of jointly Gaussian R.V.'s. Jointly Gaussian R.V.'s are completely characterized by the mean vector m and covariance matrix C. A collection of jointly Gaussian R.V.'s are uncorrelated iff C is diagonal; independence always implies non-correlation. For jointly Gaussian R.V.'s, non-correlation also implies independence. A collection of uncorrelated R.V.'s, each of which is Gaussian, may not be jointly Gaussian. If X is jointly Gaussian, then Y = AX + b is also jointly Gaussian with mean Am + b and covariance ACAᵀ.
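The last property (an affine map of a jointly Gaussian vector has mean Am + b and covariance ACAᵀ) can be checked numerically; the 2-D mean vector, covariance matrix, and affine map below are hypothetical illustration values:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# Hypothetical 2-D Gaussian with mean m and covariance C
m = [1.0, 2.0]
C = [[2.0, 0.5],
     [0.5, 1.0]]

# Affine map Y = AX + b
A = [[1.0, 1.0],
     [0.0, 2.0]]
b = [0.0, -1.0]

mean_Y = [mv + bv for mv, bv in zip(matvec(A, m), b)]   # Am + b
At = [list(col) for col in zip(*A)]                     # A transposed
cov_Y = matmul(matmul(A, C), At)                        # A C A^T
print(mean_Y, cov_Y)
```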

29 Example Let A be a binary random variable that takes the values +1 and -1 with equal probability, and let A and X be statistically independent. Let Y = AX. Find the pdf of Y. Find the covariance cov(X, Y). Find the covariance cov(X², Y²). Are X and Y jointly Gaussian?
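A simulation of this example, assuming (as the final question suggests) X ~ N(0, 1): cov(X, Y) should come out near 0, while cov(X², Y²) = Var(X²) = 2 is clearly nonzero, hinting that X and Y are uncorrelated but not jointly Gaussian.

```python
import random
random.seed(1)

n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]          # assumed X ~ N(0, 1)
signs = [random.choice((-1, 1)) for _ in range(n)]   # A = ±1 equally likely
ys = [a * x for a, x in zip(signs, xs)]              # Y = AX

def cov(u, v):
    """Sample covariance of two equal-length lists."""
    k = len(u)
    mu, mv = sum(u) / k, sum(v) / k
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / k

c_xy = cov(xs, ys)                                   # ≈ 0
c_x2y2 = cov([x * x for x in xs], [y * y for y in ys])  # ≈ 2, since Y² = X²
print(c_xy, c_x2y2)
```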
