Random Variables and Stochastic Processes – 0903720

1 Random Variables and Stochastic Processes – 0903720
Dr. Ghazi Al Sukkar
Office Hours: will be posted soon
Course Website:

2 Most Commonly Used RVs
Continuous-type: Gaussian, Log-normal, Exponential, Gamma, Erlang, Chi-square, Rayleigh, Nakagami-m, Uniform
Discrete-type: Bernoulli, Binomial, Poisson, Geometric, Negative Binomial, Discrete Uniform

3 Gaussian (or Normal) Random Variable
$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/(2\sigma^2)}$
This is a bell-shaped curve, symmetric around the parameter $\mu$, and its distribution function is given by
$F_X(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(y-\mu)^2/(2\sigma^2)}\, dy \triangleq G\!\left(\frac{x-\mu}{\sigma}\right)$
$G(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy$ (tabulated)
$Q(x) = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy = 1 - G(x) = \frac{1}{2}\operatorname{erfc}\!\left(\frac{x}{\sqrt{2}}\right)$
Since $f_X(x)$ depends on two parameters $\mu$ and $\sigma^2$, the notation $X \sim N(\mu, \sigma^2)$ is used to denote a Gaussian RV.
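As an aside (not part of the original slides), the pdf and the tabulated $G$ and $Q$ functions above can be evaluated numerically with only the Python standard library; the function names here are my own:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # f_X(x) = exp(-(x - mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def Q(x):
    # Q(x) = 1 - G(x) = (1/2) erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2))

def G(x):
    # Standard normal CDF; computed from Q, so no table is needed
    return 1.0 - Q(x)
```

`math.erfc` is the complementary error function, so the $Q$-function identity on the slide translates directly into code.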

4 𝑋N(0,1): Standard Normal RV: zero mean and Unity variance.
Most important and frequently encountered random variable in communications. Large 𝜎 2 Small 𝜎 2 𝜇 𝜇

5 Log-normal Distribution
If $Y$ is a random variable with a normal distribution, then $X = e^Y$ has a log-normal distribution. Likewise, if $X$ is log-normally distributed, then $\ln X$ is normally distributed. Denoted $\ln\mathcal{N}(\mu, \sigma^2)$.
$f_X(x) = \frac{1}{x\sqrt{2\pi\sigma^2}}\, e^{-(\ln x - \mu)^2/(2\sigma^2)}, \quad x > 0$
$F_X(x) = G\!\left(\frac{\ln x - \mu}{\sigma}\right)$
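A quick numerical sketch (mine, not the lecturer's) of the log-normal pdf and of the $X = e^Y$ relationship, using only the standard library:

```python
import math
import random

def lognormal_pdf(x, mu=0.0, sigma=1.0):
    # f_X(x) = exp(-(ln x - mu)^2 / (2 sigma^2)) / (x sqrt(2 pi sigma^2)), x > 0
    if x <= 0:
        return 0.0
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (x * math.sqrt(2 * math.pi * sigma ** 2))

def sample_lognormal(mu=0.0, sigma=1.0):
    # X = e^Y with Y ~ N(mu, sigma^2), exactly as defined on the slide
    return math.exp(random.gauss(mu, sigma))
```

Every sample produced this way is positive, consistent with the pdf's support $x > 0$.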

6 Exponential Distribution
The exponential distribution represents the probability distribution of the time intervals between successive Poisson arrivals. $X$ is exponential if
$f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
$F_X(x) = 1 - e^{-\lambda x}, \quad x \ge 0$
$X \sim \exp(\lambda)$
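A minimal sketch (function names and the example rate are my own, not from the slides) of the exponential CDF together with inverse-transform sampling, $X = -\ln(1-U)/\lambda$ for $U \sim U(0,1)$:

```python
import math
import random

def exp_cdf(x, lam=0.5):
    # F_X(x) = 1 - e^(-lam x) for x >= 0; lam = 0.5 is just an example rate
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

def sample_exp(lam=0.5):
    # Inverse-transform sampling: X = -ln(1 - U)/lam with U ~ U(0, 1)
    return -math.log(1.0 - random.random()) / lam
```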


8 The Memoryless Property of the Exponential Distribution
The exponential distribution is without memory; it is the unique continuous memoryless distribution. Let $x, s \ge 0$:
$P(X > s + x \mid X > s) = \frac{P(\{X > s+x\} \cap \{X > s\})}{P(X > s)} = \frac{P(X > s+x)}{P(X > s)} = \frac{e^{-\lambda(s+x)}}{e^{-\lambda s}} = e^{-\lambda x} = P(X > x)$
Let $X$ represent the lifetime of a piece of equipment. If the equipment has been working for time $s$, the probability that it survives an additional time $x$ depends only on $x$, and is identical to the probability that a new piece of equipment survives for time $x$.

9 Example
The amount of time a customer waits at a restaurant has an exponential distribution with a mean value of 5 minutes. The probability that a customer spends more than 10 minutes in the restaurant is
$P(X > 10) = e^{-10/5} = e^{-2} = 0.1353$
The probability that the customer spends an additional 10 minutes in the restaurant, given that he has already been there for more than 10 minutes, is
$P(X > 10 + 10 \mid X > 10) = P(X > 10) = e^{-2} = 0.1353$
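The example's numbers can be reproduced in a few lines (variable names are my own); note how the conditional probability collapses to the unconditional one, which is the memoryless property from the previous slide:

```python
import math

mean_wait = 5.0        # mean waiting time in minutes (from the example)
lam = 1.0 / mean_wait  # exponential rate parameter

def survival(x):
    # P(X > x) = e^(-lam x)
    return math.exp(-lam * x)

p_more_than_10 = survival(10)          # e^(-2), about 0.1353
p_cond = survival(20) / survival(10)   # P(X > 20 | X > 10); equals p_more_than_10
```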

10 Gamma (Erlang) Distribution
Denoted by $G(\alpha, \beta)$, $\alpha, \beta > 0$.
$f_X(x) = \begin{cases} \frac{x^{\alpha-1}}{\Gamma(\alpha)\,\beta^{\alpha}}\, e^{-x/\beta}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
$\Gamma(\alpha) = \int_0^{\infty} x^{\alpha-1} e^{-x}\, dx$ is the Gamma function, with $\Gamma(n) = (n-1)\,\Gamma(n-1) = (n-1)!$ for integer $n$.
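A sketch of the Gamma pdf (my own function, not course code); Python's `math.gamma` evaluates $\Gamma(\alpha)$, and for integer arguments it reduces to the factorial identity on the slide:

```python
import math

def gamma_pdf(x, alpha, beta):
    # f_X(x) = x^(alpha-1) e^(-x/beta) / (Gamma(alpha) beta^alpha), x >= 0
    if x < 0:
        return 0.0
    return x ** (alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta ** alpha)
```

With $\alpha = 1$ the pdf reduces to the exponential, $f_X(x) = \frac{1}{\beta} e^{-x/\beta}$.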

11 [Figure: Gamma pdfs for several parameter values; the plot's parameters $k$ and $\theta$ correspond to $\alpha$ and $\beta$]

12 Erlang Distribution
The Erlang distribution is a special case of the Gamma distribution in which the shape parameter $\alpha$ is an integer. Let $\alpha = n$, $\beta = \frac{1}{\lambda}$:
$f_X(x) = \begin{cases} \frac{\lambda^n x^{n-1}}{(n-1)!}\, e^{-\lambda x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
$F_X(x) = 1 - \sum_{k=0}^{n-1} \frac{(\lambda x)^k}{k!}\, e^{-\lambda x}$
Putting $\lambda = n\mu$ gives $G(n, \frac{1}{n\mu})$.
Application: the number of telephone calls that might be made at the same time to a switching center.
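The closed-form Erlang CDF above is easy to code directly (a sketch with my own naming); for $n = 1$ it should reduce to the exponential CDF:

```python
import math

def erlang_cdf(x, n, lam):
    # F_X(x) = 1 - sum_{k=0}^{n-1} (lam x)^k / k! * e^(-lam x), x >= 0
    if x < 0:
        return 0.0
    s = sum((lam * x) ** k / math.factorial(k) for k in range(n))
    return 1.0 - s * math.exp(-lam * x)
```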

13 Chi-Square Distribution
A special case of the Gamma distribution with $\alpha = \frac{n}{2}$ ($n$ an integer) and $\beta = 2$: $G(\frac{n}{2}, 2) \equiv \chi^2(n)$, chi-square with $n$ degrees of freedom.
$f_X(x) = \begin{cases} \frac{x^{n/2 - 1}}{\Gamma(n/2)\, 2^{n/2}}\, e^{-x/2}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
If $n = 2$, we obtain an exponential distribution.
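A sketch of the chi-square pdf (function name mine); the $n = 2$ case can be checked numerically against the exponential pdf $\frac{1}{2} e^{-x/2}$:

```python
import math

def chi2_pdf(x, n):
    # f_X(x) = x^(n/2 - 1) e^(-x/2) / (Gamma(n/2) 2^(n/2)), x >= 0
    if x < 0:
        return 0.0
    return x ** (n / 2 - 1) * math.exp(-x / 2) / (math.gamma(n / 2) * 2 ** (n / 2))
```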

14 [Figure: Chi-square pdfs for several degrees of freedom; the plot's parameter $k$ corresponds to $n$]

15 Rayleigh Distribution
$X$ is Rayleigh distributed with parameter $\sigma^2$ if $X = \sqrt{X_1^2 + X_2^2}$, where $X_1, X_2 \sim N(0, \sigma^2)$ are statistically independent.
$f_X(x) = \begin{cases} \frac{x}{\sigma^2}\, e^{-x^2/(2\sigma^2)}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}, \quad F_X(x) = 1 - e^{-x^2/(2\sigma^2)}$
Application: used to model the attenuation of wireless signals undergoing multipath fading.
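The construction $X = \sqrt{X_1^2 + X_2^2}$ can be verified by simulation (a sketch; sample size and seed are arbitrary choices of mine): the empirical CDF of the simulated envelope should track the closed-form $F_X$:

```python
import math
import random

def rayleigh_cdf(x, sigma=1.0):
    # F_X(x) = 1 - e^(-x^2 / (2 sigma^2)), x >= 0
    return 1.0 - math.exp(-x * x / (2 * sigma ** 2)) if x >= 0 else 0.0

def sample_rayleigh(sigma=1.0):
    # X = sqrt(X1^2 + X2^2) with X1, X2 iid N(0, sigma^2)
    x1, x2 = random.gauss(0, sigma), random.gauss(0, sigma)
    return math.hypot(x1, x2)

random.seed(0)  # fixed seed so the check is reproducible
samples = [sample_rayleigh() for _ in range(20000)]
empirical = sum(s <= 1.0 for s in samples) / len(samples)  # empirical P(X <= 1)
```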

16 Nakagami-m Distribution
A generalization of the Rayleigh distribution through a parameter $m$.
$f_X(x) = \begin{cases} \frac{2}{\Gamma(m)} \left(\frac{m}{\Omega}\right)^m x^{2m-1}\, e^{-m x^2/\Omega}, & x > 0 \\ 0, & \text{otherwise} \end{cases}$
Putting $m = 1$ gives the Rayleigh distribution.
Application: gives greater flexibility in modeling randomly fluctuating channels in wireless communication theory.

17 [Figure: Nakagami-m pdfs for several parameter values; the plot's parameters $\mu$ and $\omega$ correspond to $m$ and $\Omega$]

18 Uniform Random Variable
A continuous random variable that takes values between $a$ and $b$ with equal probability over intervals of equal length: $X \sim U(a, b)$.
$f_X(x) = \begin{cases} \frac{1}{b-a}, & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$
$F_X(x) = \begin{cases} 0, & x < a \\ \frac{x-a}{b-a}, & a \le x \le b \\ 1, & x \ge b \end{cases}$
The phase of a received sinusoidal carrier is usually modeled as a uniform random variable between $0$ and $2\pi$. Quantization error is also typically modeled as uniform.

19 Discrete Random Variables
Bernoulli, Binomial, Poisson, Geometric, Negative Binomial, Discrete Uniform

20 Bernoulli Random Variable
The simplest possible random experiment: two possibilities, e.g. accept/failure, male/female, rain/no rain. One of the possibilities is mapped to 1: $X(\text{failure}) = 0$, $X(\text{accept}) = 1$.
$P(\text{accept}) = P(X = 1) = p, \quad P(\text{failure}) = P(X = 0) = 1 - p = q$
A good model for a binary data source whose output is 1 or 0; can also be used to model channel errors.


22 Binomial Random Variable
$Y$ is a discrete random variable that gives the number of 1's in a sequence of $n$ independent Bernoulli trials:
$Y = \sum_{i=1}^{n} X_i$, where $X_i$, $i = 1, 2, \ldots, n$, are independent and identically distributed (iid) Bernoulli RVs.
$P_n(k) = P(Y = k) = \binom{n}{k} p^k (1-p)^{n-k}$
$f_Y(y) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k}\, \delta(y - k)$
$F_Y(y) = \sum_{k=0}^{\lfloor y \rfloor} \binom{n}{k} p^k (1-p)^{n-k}$
[Figure: staircase binomial CDF $F_X(x)$]
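The binomial PMF and CDF above translate directly to code via `math.comb` (a sketch, names mine); the PMF must sum to 1 over $k = 0, \ldots, n$:

```python
import math

def binom_pmf(k, n, p):
    # P(Y = k) = C(n, k) p^k (1-p)^(n-k)
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def binom_cdf(y, n, p):
    # F_Y(y) = sum of the PMF for k = 0 .. floor(y)
    return sum(binom_pmf(k, n, p) for k in range(int(math.floor(y)) + 1))
```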

23 Poisson Probability Mass Function
Assume:
1. The number of events occurring in a small time interval $\Delta t$ is $\lambda' \Delta t$ as $\Delta t \to 0$.
2. The numbers of events occurring in non-overlapping time intervals are independent.
Then the number of events in a time interval $T$ has a Poisson probability mass function of the form
$f_X(x) = \sum_{k=0}^{\infty} P(X = k)\, \delta(x - k), \quad P(X = k) = \frac{\lambda^k}{k!}\, e^{-\lambda}, \quad k = 0, 1, 2, \ldots$
where $\lambda = \lambda' T$.
Applications: the number of phone calls at a call center per minute; the number of times a web server is accessed per minute.
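A one-function sketch of the Poisson PMF (naming mine); its probabilities over $k = 0, 1, 2, \ldots$ sum to 1:

```python
import math

def poisson_pmf(k, lam):
    # P(X = k) = lam^k / k! * e^(-lam)
    return lam ** k / math.factorial(k) * math.exp(-lam)
```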

24 Geometric Distribution
How many items must be produced to get one that passes quality control; the number of days until it rains: a sequence of failures until the first success in a sequence of Bernoulli trials.
Possible values: if we count all trials, $1, 2, \ldots$; if we count only the failures, $0, 1, 2, \ldots$

25 Derivation of the Probability Mass Function – Counting All Trials
Let $X$ be the number of trials needed to reach the first success in repeated Bernoulli trials. Consider the sequence FFFFS, which has probability $(1-p)^4 p$; a general sequence looks like FFF…FFS. The probability of having $k-1$ failures before the first success is
$P(k \text{ trials}) = P(X = k) = P(k-1 \text{ failures, then a success}) = (1-p)^{k-1} p, \quad k = 1, 2, \ldots$
The cumulative distribution function can be found to be
$F_X(x) = P(X \le x) = \sum_{k=1}^{x} P(X = k) = 1 - (1-p)^x \implies P(X > x) = (1-p)^x$
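The geometric PMF and survival function are two one-liners (a sketch, names mine); the survival form $(1-p)^x$ is what makes the memoryless property on the next slide immediate:

```python
def geom_pmf(k, p):
    # P(X = k) = (1-p)^(k-1) p, for k = 1, 2, ...
    return (1 - p) ** (k - 1) * p

def geom_survival(x, p):
    # P(X > x) = (1-p)^x
    return (1 - p) ** x
```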

26 The Memoryless Property
What happens to the distribution given that $n$ failures have already occurred? (Say we have been waiting for an empty cab and have watched 7 occupied ones pass.) Formally,
$P(X > n + x \mid X > n) = \frac{P(\{X > n+x\} \cap \{X > n\})}{P(X > n)} = \frac{P(X > n+x)}{P(X > n)} = \frac{(1-p)^{n+x}}{(1-p)^n} = (1-p)^x = P(X > x)$
That is, the probability of exceeding $n + x$ having reached $n$ is the same as the probability of exceeding $x$ starting from the beginning; in other words, no aging. Given that the first $n$ trials had no success, the conditional probability that the first success appears after an additional $x$ trials depends only on $x$, not on $n$ (not on the past).

27 Negative Binomial Distribution
Let $Y$ be the number of Bernoulli trials required to realize $r$ successes.
$P(Y = k) = P(r-1 \text{ successes in } k-1 \text{ trials, and a success at the } k\text{th trial}) = \binom{k-1}{r-1} p^{r-1} q^{k-r}\, p = \binom{k-1}{r-1} p^r q^{k-r}, \quad k = r, r+1, \ldots$
where $q = 1 - p$.
If $n$ or fewer trials are needed for $r$ successes, then the number of successes in $n$ trials must be at least $r$: $P(Y \le n) = P(X \ge r)$, where $Y \sim \mathrm{NB}(r, p)$ is a negative binomial RV and $X$ is a binomial RV.

28 Let $Z = Y - r$: the number of failures preceding the $r$th success.
$P(Z = k) = P(Y = k + r) = \binom{r+k-1}{r-1} p^r q^k = \binom{r+k-1}{k} p^r q^k, \quad k = 0, 1, 2, \ldots$
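The two negative-binomial forms above can be checked against each other numerically (a sketch, names mine): the PMF of $Y$ from the previous slide, and the PMF of $Z = Y - r$ obtained by shifting it:

```python
import math

def negbin_pmf(k, r, p):
    # P(Y = k) = C(k-1, r-1) p^r q^(k-r), for k = r, r+1, ...
    q = 1 - p
    return math.comb(k - 1, r - 1) * p ** r * q ** (k - r)

def negbin_failures_pmf(k, r, p):
    # P(Z = k) = P(Y = k + r) = C(r+k-1, k) p^r q^k, for k = 0, 1, 2, ...
    q = 1 - p
    return math.comb(r + k - 1, k) * p ** r * q ** k
```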

29 Uniform Probability Mass Function
$p_i = P(X = x_i) = \frac{1}{n}, \quad i = 1, 2, \ldots, n$
$f_X(x) = \sum_{i=1}^{n} \frac{1}{n}\, \delta(x - x_i)$
[Figure: $f_X(x)$ — impulses of height $\frac{1}{n}$ at $x_1, x_2, x_3, \ldots, x_n$]

