
Pairs of Random Variables (Random Process)

Introduction  In this lecture you will study:  Joint pmf, cdf, and pdf  Joint moments  The degree of “correlation” between two random variables  Conditional probabilities of a pair of random variables

Two Random Variables  The mapping is written as to each outcome is S

Example 1

Example 2

Two Random Variables  The events evolving a pair of random variables (X, Y) can be represented by regions in the plane

Two Random Variables  To determine the probability that the pair is in some region B in the plane, we have  Thus, the probability is  The joint pmf, cdf, and pdf provide approaches to specifying the probability law that governs the behavior of the pair (X, Y)  Firstly, we have to determine what we call product form where A k is one-dimensional event

Two Random Variables  The probability of product-form events is  Some two-dimensional product-form events are shown below

Pairs of Discrete Random Variables  Let the vector random variable assume values from some countable set  The joint pmf of X specifies the probabilities of event  The values of the pmf on the set S X,Y provide

Pairs of Discrete Random Variables

 The probability of any event B is the sum of the pmf over the outcomes in B: $P[(X, Y) \in B] = \sum_{(x_j, y_k) \in B} p_{X,Y}(x_j, y_k)$
 When the event B is the entire sample space $S_{X,Y}$, we have $\sum_j \sum_k p_{X,Y}(x_j, y_k) = 1$

Marginal Probability Mass Function
 The joint pmf provides the information about the joint behavior of X and Y
 The marginal probability mass functions describe each random variable in isolation: $p_X(x_j) = \sum_k p_{X,Y}(x_j, y_k)$, and similarly $p_Y(y_k) = \sum_j p_{X,Y}(x_j, y_k)$
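As a quick illustration of how the marginals come out of a joint pmf table, here is a minimal Python sketch; the pmf values are made up for illustration and are not taken from the textbook.

```python
import numpy as np

# Hypothetical joint pmf of (X, Y) on {0, 1, 2} x {0, 1}:
# rows index the values of X, columns the values of Y; entries sum to 1.
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.15, 0.10]])

p_x = p_xy.sum(axis=1)   # marginal pmf of X: sum over all y for each x
p_y = p_xy.sum(axis=0)   # marginal pmf of Y: sum over all x for each y

print(p_x)   # [0.3  0.45 0.25]
print(p_y)   # [0.55 0.45]
```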

Example 3

The Joint Cdf of X and Y  The joint cumulative distribution function of X and Y is defined as the probability of the event  The properties are

The Joint Cdf of X and Y
 In particular, the probability of a rectangle follows from the joint cdf: $P[x_1 < X \le x_2,\; y_1 < Y \le y_2] = F_{X,Y}(x_2, y_2) - F_{X,Y}(x_1, y_2) - F_{X,Y}(x_2, y_1) + F_{X,Y}(x_1, y_1)$
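As a quick check of the rectangle formula, assume for illustration the product-form cdf $F_{X,Y}(x, y) = (1 - e^{-x})(1 - e^{-y})$ for $x, y \ge 0$:

```latex
P[1 < X \le 2,\; 1 < Y \le 2]
  = F(2,2) - F(1,2) - F(2,1) + F(1,1)
  = \left(e^{-1} - e^{-2}\right)^2 \approx 0.054
```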

Example 4

The Joint Pdf of Two Continuous Random Variables  Generally, the probability of events in any shape can be approximated by rctangles of infinitesimal width that leads to integral operation  Random variables X and Y are jointly continuous if the probability of events involving (X, Y) can be expressed as an integral of probability density function  The joint probability density function is given by

The Joint Pdf of Two Continuous Random Variables

 The joint cdf can be obtained from the joint pdf: $F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\, dv\, du$
 It follows that $f_{X,Y}(x, y) = \dfrac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}$
 The probability of a rectangular region is obtained by letting $B = \{(x, y) : a_1 < x \le b_1,\; a_2 < y \le b_2\}$, which gives $P[a_1 < X \le b_1,\; a_2 < Y \le b_2] = \int_{a_1}^{b_1} \int_{a_2}^{b_2} f_{X,Y}(x, y)\, dy\, dx$

The Joint Pdf of Two Continuous Random Variables  We can, then, prove that the probability of an infinitesimal rectangle is  The marginal pdf’s can be obtained by

The Joint Pdf of Two Continuous Random Variables

Example 5

Example 6

Independence of Two Random Variables  X and Y are independent random variable if any event A 1 defined in terms of X is independent of any event A 2 defined in terms of Y  If X and Y are independent discrete random variables, then the joint pmf is equal to the product of the marginal pmf’s

Independence of Two Random Variables  If the joint pmf of X and Y equals the product of the marginal pmf’s, then X and Y are independent  Discrete random variables X and Y are independent iff the joint pmf is equal to the product of the marginal pmf’s for all x j, y k

Independence of Two Random Variables  In general, the random variables X and Y are independent iff their joint cdf is equal to the product of its marginal cdf’s  In continuous case, X and Y are independent iff their joint pdf’s is equal to the product of the marginal pdf’s

Joint Moments and Expected Values  The expected value of is given by  Sum of random variable

Joint Moments and Expected Values  In general, the expected value of a sum of n random variables is equal to the sum of the expected values  Suppose that, we can get

Joint Moments and Expected Values  The jk-th joint moment of X and Y is given by  When j = 1 and k = 1, we can say that as correlation of X and Y  And when E[XY] = 0, then we say that X and Y are orthogonal

Conditional Probability Case 1: X is a Discrete Random Variable
 For X and Y discrete random variables, the conditional pmf of Y given X = x is given by $p_{Y|X}(y \mid x) = P[Y = y \mid X = x] = \dfrac{p_{X,Y}(x, y)}{p_X(x)}$ for $p_X(x) > 0$
 The probability of an event A given X = x_k is found by using $P[Y \in A \mid X = x_k] = \sum_{y \in A} p_{Y|X}(y \mid x_k)$
 If X and Y are independent, we have $p_{Y|X}(y \mid x) = p_Y(y)$

Conditional Probability  The joint pmf can be expressed as the product of a conditional pmf and marginal pmf  The probability that Y is in A can be given by

Conditional Probability  Example:

Conditional Probability  Suppose Y is a continuous random variable, the conditional cdf of Y given X = x k is  We, therefore, can get the conditional pdf of Y given X = x k  If X and Y are independent, then  The probability of event A given X = x k is obtained by

Conditional Probability  Example: binary communications system

Conditional Probability Case 2: X is a Continuous Random Variable
 If X is a continuous random variable, then P[X = x] = 0, so the conditional cdf must be defined as a limit
 If X and Y have a joint pdf that is continuous and nonzero over some region of the plane, the conditional cdf of Y given X = x is $F_{Y|X}(y \mid x) = \dfrac{\int_{-\infty}^{y} f_{X,Y}(x, y')\, dy'}{f_X(x)}$

Conditional Probability  The conditional pdf of Y given X = x is  The probability of event A given X = x is obtained by  If X and Y are independent, then and  The probability Y in A is

Conditional Probability  Example

Conditional Expectation  The conditional expectation of Y given X = x is given by  When X and Y are both discrete random variables

Conditional Expectation  In particular we have where

Pairs of Jointly Gaussian Random Variables
 The random variables X and Y are said to be jointly Gaussian if their joint pdf has the form $f_{X,Y}(x, y) = \dfrac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\!\left\{ \dfrac{-1}{2(1 - \rho^2)} \left[ \dfrac{(x - m_X)^2}{\sigma_X^2} - \dfrac{2\rho (x - m_X)(y - m_Y)}{\sigma_X \sigma_Y} + \dfrac{(y - m_Y)^2}{\sigma_Y^2} \right] \right\}$, where $m_X, m_Y$ are the means, $\sigma_X, \sigma_Y$ the standard deviations, and $\rho$ the correlation coefficient
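A short Python sketch that generates jointly Gaussian pairs with a chosen correlation coefficient; the parameters are assumptions for illustration (the lab below studies a related MATLAB program from Garcia's book):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
rho = 0.8   # assumed correlation coefficient

# Standard construction: start from independent standard Gaussians and mix
# them so that (X, Y) is jointly Gaussian with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

print(np.corrcoef(x, y)[0, 1])   # sample correlation, ~ 0.8
```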

Lab assignment
 In groups of 2 (for the international class: work individually), refer to Garcia's book, Example 5.49, page 285
 Run the program in MATLAB and analyze the result
 Your analysis should contain:
 The purpose of the program
 A line-by-line explanation of the program (do not copy from the book; remember, NO PLAGIARISM is allowed)
 The explanation of the figures, including Fig. 5.29
 The relationship between the purpose of the program and the content of Chapter 5 (i.e., it answers the question: why do we study the Gaussian distribution in this chapter?)
 Try different parameter values, such as 100 observations and other sample sizes, and analyze the results
 Due date: next week

Regular Class
 NEXT WEEK: QUIZ 1
 Material: Chapters 1 to 5, Garcia's book
 Duration: max 1 hour