LECTURE IV Random Variables and Probability Distributions I.

Conditional Probability and Independence In order to define the concept of a conditional probability it is necessary to discuss joint probabilities and marginal probabilities. A joint probability is the probability of two random events occurring together. For example, consider drawing two cards from a deck of cards: there are 52 x 51 = 2,652 ordered pairs for the first two cards drawn from the deck. The marginal probability is the overall probability of a single event, such as the probability of drawing a given card as the first card.
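As a minimal sketch of the card example (the counting is done by brute-force enumeration, and the specific cards are chosen only for illustration), the joint probability of a particular ordered pair and the marginal probability of a particular first card can be computed directly:

```python
from itertools import permutations

# Build a 52-card deck: ranks 2-10, J, Q, K, A in four suits.
ranks = [str(n) for n in range(2, 11)] + ["J", "Q", "K", "A"]
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(r, s) for r in ranks for s in suits]

# All ordered draws of two cards without replacement: 52 * 51 = 2,652 pairs.
draws = list(permutations(deck, 2))
print(len(draws))  # 2652

# Joint probability: first card is the ace of spades AND second is the king of spades.
joint = sum(1 for d in draws
            if d[0] == ("A", "spades") and d[1] == ("K", "spades")) / len(draws)
print(joint)     # 1/2652 ≈ 0.000377

# Marginal probability: first card is the ace of spades, ignoring the second card.
marginal = sum(1 for d in draws if d[0] == ("A", "spades")) / len(draws)
print(marginal)  # 51/2652 = 1/52 ≈ 0.0192
```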

The conditional probability of an event is the probability of that event given that some other event has occurred. In the textbook example, what is the probability that the die shows a one if you know that the face number is odd? (1/3). However, note that if you know that the roll of the die is a one, the probability that the roll is odd is 1.
Axioms of Conditional Probability:
1. P(A|B) ≥ 0 for any event A.
2. P(A|B) = 1 for any event A ⊃ B.
3. If {Ai ∩ B}, i = 1, 2, 3, …, are mutually exclusive, then P(A1 ∪ A2 ∪ …|B) = P(A1|B) + P(A2|B) + ….
4. If B ⊃ H, B ⊃ G, and P(G) ≠ 0, then P(H|B)/P(G|B) = P(H)/P(G).

Theorem 2.4.1: P(A|B) = P(A ∩ B)/P(B) for any pair of events A and B such that P(B) ≠ 0.
Theorem (Bayes' Theorem): Let events A1, A2, …, An be mutually exclusive such that P(A1 ∪ A2 ∪ … ∪ An) = 1 and P(Ai) > 0 for each i. Let E be an arbitrary event such that P(E) > 0. Then
P(Ai|E) = P(E|Ai)P(Ai) / [P(E|A1)P(A1) + P(E|A2)P(A2) + … + P(E|An)P(An)].
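To make the formula concrete, here is a minimal sketch with a made-up partition A1, A2, A3, made-up priors P(Ai), and made-up likelihoods P(E|Ai); it computes the posteriors P(Ai|E) exactly as the theorem prescribes:

```python
# Hypothetical partition A1, A2, A3 with priors P(Ai) and likelihoods P(E|Ai).
priors = {"A1": 0.5, "A2": 0.3, "A3": 0.2}           # must sum to 1
likelihoods = {"A1": 0.10, "A2": 0.40, "A3": 0.80}   # P(E|Ai)

# Marginal probability of E: P(E) = sum over j of P(E|Aj) P(Aj).
p_e = sum(likelihoods[a] * p for a, p in priors.items())

# Posterior for each Ai: P(Ai|E) = P(E|Ai) P(Ai) / P(E).
posteriors = {a: likelihoods[a] * p / p_e for a, p in priors.items()}

print(p_e)         # 0.05 + 0.12 + 0.16 = 0.33
print(posteriors)  # {'A1': ≈0.152, 'A2': ≈0.364, 'A3': ≈0.485}, summing to 1
```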

Another manifestation of this theorem follows from the joint distribution function:
P(Ai ∩ E) = P(E|Ai)P(Ai) and P(E) = P(E|A1)P(A1) + P(E|A2)P(A2) + … + P(E|An)P(An).
The bottom equality is simply the marginal probability of event E. This yields a friendlier version of Bayes' theorem based on the ratio between the joint and marginal distribution functions:
P(Ai|E) = P(Ai ∩ E) / P(E).

Statistical independence means that the probability of one event (or the distribution of one random variable) does not depend on the outcome of another.
Definition 2.4.1: Events A and B are said to be independent if P(A) = P(A|B).
Definition 2.4.2: Events A, B, and C are said to be mutually independent if the following equalities hold: P(A ∩ B) = P(A)P(B), P(A ∩ C) = P(A)P(C), P(B ∩ C) = P(B)P(C), and P(A ∩ B ∩ C) = P(A)P(B)P(C).
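The distinction between pairwise and mutual independence is easy to miss. The following sketch uses a classic illustrative example that is not from the lecture: two fair coin tosses with A = "first toss is heads", B = "second toss is heads", and C = "exactly one head". Every pair satisfies the product rule, but the three-way equality fails:

```python
from itertools import product

# Sample space of two fair coin tosses; each outcome has probability 1/4.
sample_space = list(product("HT", repeat=2))
prob = lambda event: sum(1 for s in sample_space if event(s)) / len(sample_space)

A = lambda s: s[0] == "H"                      # first toss is heads
B = lambda s: s[1] == "H"                      # second toss is heads
C = lambda s: (s[0] == "H") != (s[1] == "H")   # exactly one head

both = lambda e1, e2: (lambda s: e1(s) and e2(s))

# Pairwise product rule holds for every pair ...
print(prob(both(A, B)), prob(A) * prob(B))   # 0.25 0.25
print(prob(both(A, C)), prob(A) * prob(C))   # 0.25 0.25
print(prob(both(B, C)), prob(B) * prob(C))   # 0.25 0.25

# ... but the three-way condition fails, so A, B, C are not mutually independent.
print(prob(lambda s: A(s) and B(s) and C(s)), prob(A) * prob(B) * prob(C))  # 0.0 0.125
```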

Basic Concept of Random Variables Definition 1.4.1: A random variable is a function from a sample space S into the real numbers. In this way, a random variable is an abstraction of the underlying random experiment.

The probability function (or measure) is then defined based on that random variable: P_X(X = xi) = P({s ∈ S : X(s) = xi}).
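A minimal sketch of this construction, using the sum of two fair dice as an assumed example (the lecture does not specify one): the random variable X maps each sample point to a real number, and the probability function of X is induced by counting the sample points that map to each value.

```python
from itertools import product
from collections import Counter

# Sample space: all 36 equally likely outcomes of rolling two fair dice.
S = list(product(range(1, 7), repeat=2))

# Random variable X : S -> R, here the sum of the two faces.
X = lambda s: s[0] + s[1]

# Induced probability function: P_X(X = x) = P({s in S : X(s) = x}).
counts = Counter(X(s) for s in S)
P_X = {x: c / len(S) for x, c in sorted(counts.items())}

print(P_X[7])             # 6/36 ≈ 0.1667
print(sum(P_X.values()))  # ≈ 1.0 (up to floating-point rounding)
```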

Definition of a Random Variable Definition A random variable is a variable that takes values according to a certain probability. Definition A random variable is a real-valued function defined over a sample space.

Discrete Random Variables Definition A discrete random variable is a variable that takes a countable number of real numbers with certain probabilities. Definition A bivariate discrete random variable is a variable that takes a countable number of points on the plane with certain probabilities.

In a bivariate distribution, the marginal distribution is the distribution of one variable unconditioned on the outcome of the other variable; it is obtained by summing the joint probabilities over all values of the other variable.
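A minimal sketch with a hypothetical 2x3 joint probability table (not from the lecture): the marginal distribution of X is obtained by summing the joint probabilities across each row, and the marginal of Y by summing down each column.

```python
# Hypothetical joint distribution P(X = xi, Y = yj): rows are x values, columns are y values.
joint = [
    [0.10, 0.20, 0.10],   # x = x1
    [0.15, 0.30, 0.15],   # x = x2
]

# Marginal of X: sum each row over j.
marginal_x = [sum(row) for row in joint]        # ≈ [0.40, 0.60]

# Marginal of Y: sum each column over i.
marginal_y = [sum(col) for col in zip(*joint)]  # ≈ [0.25, 0.50, 0.25]

print(marginal_x, marginal_y)
print(sum(marginal_x), sum(marginal_y))         # both ≈ 1.0
```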

Applying Bayes Theorem Definition Discrete random variables X and Y are said to be independent if the event (X = xi) and the event (Y = yj) are independent for all i, j. That is to say, P(X = xi, Y = yj) = P(X = xi)P(Y = yj).

Uncorrelated Binomial

Conditional Probabilities

Uncorrelated Discrete Normal

Conditional Probabilities

Correlated Discrete Normal

Conditional Probabilities

Theorem Discrete random variables X and Y with the probability distribution given in Table 3.1 are independent if and only if every row is proportional to any other row or, equivalently, every column is proportional to any other column.
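A minimal sketch of this check, using the same hypothetical joint table as above in place of Table 3.1 (which is not reproduced in the transcript): independence holds exactly when every cell equals the product of its row and column marginals, which is equivalent to all rows being proportional to one another.

```python
# Hypothetical joint table standing in for Table 3.1; rows index x values, columns index y values.
joint = [
    [0.10, 0.20, 0.10],   # row for x1
    [0.15, 0.30, 0.15],   # row for x2: 1.5 times row x1, so the rows are proportional
]

marginal_x = [sum(row) for row in joint]
marginal_y = [sum(col) for col in zip(*joint)]

# Independence: P(X = xi, Y = yj) = P(X = xi) * P(Y = yj) for every cell.
independent = all(
    abs(joint[i][j] - marginal_x[i] * marginal_y[j]) < 1e-9
    for i in range(len(joint))
    for j in range(len(joint[0]))
)
print(independent)   # True for this table
```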

Multivariate Random Variables Definition A T-variate random variable is a variable that takes a countable number of points in T-dimensional Euclidean space with certain probabilities.

Univariate Continuous Random Variables Definition If there is a nonnegative function f(x) defined over the whole line such that P(x1 ≤ X ≤ x2) = ∫[x1, x2] f(x) dx for any x1, x2 satisfying x1 ≤ x2, then X is a continuous random variable and f(x) is called its density function.
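A minimal sketch of this definition, using the standard normal density as an assumed example of f(x): P(x1 ≤ X ≤ x2) is the area under f between x1 and x2, approximated here by midpoint-rule numerical integration.

```python
import math

def f(x):
    # Standard normal density, used only as an example of f(x).
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def prob(x1, x2, n=100_000):
    """Approximate P(x1 <= X <= x2) = integral of f over [x1, x2] with the midpoint rule."""
    width = (x2 - x1) / n
    return sum(f(x1 + (k + 0.5) * width) for k in range(n)) * width

print(prob(-1.0, 1.0))    # ≈ 0.6827
print(prob(-10.0, 10.0))  # ≈ 1.0, the total area under the density
```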

By axiom 2, the total area under the density function must equal 1: ∫[−∞, ∞] f(x) dx = 1. The simplest example of a continuous random variable is the uniform distribution: f(x) = 1/(b − a) for a ≤ x ≤ b and f(x) = 0 otherwise.

It is obvious that f(x) ≥ 0 and that ∫[a, b] 1/(b − a) dx = (b − a)/(b − a) = 1, so the uniform density satisfies the requirements of a density function.

Definition Let X have density f(x). The conditional density of X given a ≤ X ≤ b, denoted by f(x | a ≤ X ≤ b), is defined by f(x | a ≤ X ≤ b) = f(x) / P(a ≤ X ≤ b) = f(x) / ∫[a, b] f(t) dt for a ≤ x ≤ b, and 0 otherwise.

Definition Let X have the density f(x) and let S be a subset of the real line such that P(X ∈ S) > 0. Then the conditional density of X given X ∈ S, denoted by f(x|S), is defined by f(x|S) = f(x) / P(X ∈ S) for x ∈ S, and f(x|S) = 0 otherwise.
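A minimal sketch of this truncation, again using the standard normal density and the assumed set S = [0, 2]: dividing f(x) by P(X ∈ S) rescales the density on S so that it integrates to one.

```python
import math

def f(x):
    # Standard normal density, used only as an example of f(x).
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

a, b = 0.0, 2.0   # S = [a, b]

def integrate(g, lo, hi, n=100_000):
    # Midpoint-rule approximation of the integral of g over [lo, hi].
    width = (hi - lo) / n
    return sum(g(lo + (k + 0.5) * width) for k in range(n)) * width

p_S = integrate(f, a, b)                 # P(X in S) ≈ 0.4772

def f_cond(x):
    # Conditional (truncated) density: f(x|S) = f(x) / P(X in S) on S, 0 elsewhere.
    return f(x) / p_S if a <= x <= b else 0.0

print(p_S)                      # ≈ 0.4772
print(integrate(f_cond, a, b))  # ≈ 1.0, the conditional density integrates to one
```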

Common Univariate Distributions Uniform Distribution

Gamma Distribution

Normal Distribution

Beta Distribution
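The density formulas for these families appeared as figures on the slides and are not reproduced in the transcript. As a rough sketch, assuming scipy is available and keeping in mind that scipy's shape/scale parameterizations may differ from the lecture's notation, each density can be evaluated as follows (parameter values are arbitrary):

```python
from scipy import stats

x = 0.5

# Uniform on [a, b]: f(x) = 1/(b - a) on the interval.
print(stats.uniform.pdf(x, loc=0.0, scale=2.0))   # 1/(2 - 0) = 0.5

# Gamma with shape alpha and scale theta (scipy's parameterization).
print(stats.gamma.pdf(x, a=2.0, scale=1.0))

# Normal with mean mu and standard deviation sigma.
print(stats.norm.pdf(x, loc=0.0, scale=1.0))

# Beta with shape parameters alpha and beta, supported on (0, 1).
print(stats.beta.pdf(x, a=2.0, b=3.0))
```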