The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL Chapter 4. Discrete Probability Distributions Section 4.11: Markov Chains Jiaping Wang Department of Mathematical Science

The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL Chapter 4. Discrete Probability Distributions Section 4.11: Markov Chains Jiaping Wang Department of Mathematical Science 03/18/2013, Monday

Outline
Introduction
Formal Definition and Formulas

Part 1. Introduction

An Example

Consider a system that can be in any of a finite number of states, and assume that the system moves from state to state according to some prescribed probability law. For example, the states may represent whether the economy is in a bull market (S1), a bear market (S2), or a recession (S3) during a given week. From week to week the economy transitions between these states, e.g., from bull market to bear market, or from recession to bear market, with certain probabilities. If we denote by X_i the state in week i, then the transition probabilities shown in the figure are

P(X_i = S1 | X_{i-1} = S1) = 0.90 = p_11, P(X_i = S2 | X_{i-1} = S1) = 0.075 = p_12, P(X_i = S3 | X_{i-1} = S1) = 0.025 = p_13,
P(X_i = S1 | X_{i-1} = S2) = 0.15 = p_21, P(X_i = S2 | X_{i-1} = S2) = 0.80 = p_22, P(X_i = S3 | X_{i-1} = S2) = 0.05 = p_23,
P(X_i = S1 | X_{i-1} = S3) = 0.25 = p_31, P(X_i = S2 | X_{i-1} = S3) = 0.25 = p_32, P(X_i = S3 | X_{i-1} = S3) = 0.50 = p_33.

Transition Matrix

The one-step transition probabilities p_ij = P(X_n = S_j | X_{n-1} = S_i) are collected into the transition matrix P = (p_ij), each of whose rows sums to 1. For the market example above,

P =
[ 0.90  0.075  0.025 ]
[ 0.15  0.80   0.05  ]
[ 0.25  0.25   0.50  ]
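As a sketch (in Python, which the slides do not use), the market example can be encoded and sanity-checked; the state ordering S1 (bull), S2 (bear), S3 (recession) is an assumption about the slide's figure:

```python
import numpy as np

# Illustrative encoding of the market example; entry P[i, j] is the
# probability of moving from state i+1 to state j+1 in one week.
P = np.array([
    [0.90, 0.075, 0.025],  # from S1: p11, p12, p13
    [0.15, 0.80,  0.05],   # from S2: p21, p22, p23
    [0.25, 0.25,  0.50],   # from S3: p31, p32, p33
])

# Every row is a conditional distribution, so each row must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Given a bull market this week, next week's state distribution is row 1.
p_next = np.array([1.0, 0.0, 0.0]) @ P
print(p_next)  # [0.9   0.075 0.025]
```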

Part 2. Formal Definitions

Markov Chain

A sequence of random variables X_0, X_1, X_2, … taking values in a finite set of states S_1, …, S_m is a Markov chain if
P(X_n = S_j | X_0, …, X_{n-1} = S_i) = P(X_n = S_j | X_{n-1} = S_i) = p_ij;
that is, the probability of the next state depends only on the current state, not on the earlier history, and the one-step transition probabilities p_ij do not change over time.

Basic Operation: Probability after n Steps

Denote p^(0) = [p_1^(0), p_2^(0), …, p_m^(0)] with p_k^(0) = P(X_0 = S_k), where X_0 denotes the starting state of the system. Then p^(n) = [p_1^(n), …, p_m^(n)] is the vector of state probabilities after n steps. We have p^(1) = p^(0)P and, in general, p^(n) = p^(n-1)P = p^(0)P^n, where P is the transition matrix. If P is regular, then the limit π = (π_1, …, π_m) = lim_{n→∞} p^(n) exists and satisfies π = πP.
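A minimal sketch of these formulas, reusing the market transition matrix from Part 1: iterate p^(n) = p^(n-1)P and compare against the stationary π obtained as the left eigenvector of P for eigenvalue 1:

```python
import numpy as np

# Transition matrix of the bull/bear/recession example (rows: current state).
P = np.array([
    [0.90, 0.075, 0.025],
    [0.15, 0.80,  0.05],
    [0.25, 0.25,  0.50],
])

# Iterate p^(n) = p^(n-1) P starting from p^(0) = (1, 0, 0).
p = np.array([1.0, 0.0, 0.0])
for _ in range(500):
    p = p @ P

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized
# so its entries sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# For a regular P the iterates converge to pi, and pi = pi P.
assert np.allclose(p, pi)
assert np.allclose(pi @ P, pi)
print(np.round(pi, 4))
```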

Example 4.29

A supermarket stocks three brands of coffee, A, B, and C, and customers switch from brand to brand according to the transition matrix

P =
[ 3/4  1/4   0  ]
[  0   2/3  1/3 ]
[ 1/4  1/4  1/2 ]

where S1 corresponds to a purchase of brand A, S2 to brand B, and S3 to brand C; that is, 3/4 of the customers buying brand A also buy brand A the next time they purchase coffee, whereas 1/4 of these customers switch to brand B.
1. Find the probability that a customer who buys brand A today will again purchase brand A two weeks from today, assuming that he or she purchases coffee once a week.
2. In the long run, what fraction of customers purchase the respective brands?

Solution

Answer:
1. Assuming that the customer is chosen at random, his or her transition probabilities are given by P. The given information indicates that p^(0) = (1, 0, 0); that is, the customer starts with a purchase of brand A. Then p^(1) = p^(0)P = (3/4, 1/4, 0) gives the probabilities for next week's purchase. The probabilities for two weeks from now are given by p^(2) = p^(1)P = (9/16, 17/48, 1/12). That is, the chance that the customer purchases brand A two weeks from now is 9/16.
2. The long-run fractions are given by π, the stationary distribution. The equation π = πP yields
π_1 = (3/4)π_1 + (1/4)π_3
π_2 = (1/4)π_1 + (2/3)π_2 + (1/4)π_3
π_3 = (1/3)π_2 + (1/2)π_3
together with π_1 + π_2 + π_3 = 1. Solving this system gives π = (2/7, 3/7, 2/7).
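The solution can be double-checked with exact rational arithmetic; the helper `step` is a hypothetical name for one application of p → pP:

```python
from fractions import Fraction as Fr

# Transition matrix of the coffee example (rows: current brand A, B, C).
P = [[Fr(3, 4), Fr(1, 4), Fr(0)],
     [Fr(0),    Fr(2, 3), Fr(1, 3)],
     [Fr(1, 4), Fr(1, 4), Fr(1, 2)]]

def step(p, P):
    """One step of the chain: p^(n) = p^(n-1) P."""
    return [sum(p[i] * P[i][j] for i in range(3)) for j in range(3)]

p1 = step([Fr(1), Fr(0), Fr(0)], P)  # after one week, starting from brand A
p2 = step(p1, P)                     # after two weeks
print(p1)  # [Fraction(3, 4), Fraction(1, 4), Fraction(0, 1)]
print(p2)  # [Fraction(9, 16), Fraction(17, 48), Fraction(1, 12)]

pi = [Fr(2, 7), Fr(3, 7), Fr(2, 7)]
assert step(pi, P) == pi             # pi = pi P: pi is stationary
```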

The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL Chapter 5. Continuous Probability Distributions Section 5.1: Continuous Random Variables and Their Probability Distributions Jiaping Wang Department of Mathematical Science 03/18/2013, Monday

Outline
Introduction
Density Function
Distribution Function
More Examples

Part 1. Introduction

An Experiment

Measuring the life length X of 50 batteries of a certain type.

Part 2. Density Function

Definition 5.1

A random variable X is said to be continuous if there is a function f(x), called the probability density function, such that
P(a ≤ X ≤ b) = ∫_a^b f(x) dx for every a ≤ b,
with f(x) ≥ 0 and ∫_{-∞}^{∞} f(x) dx = 1. Notice that P(X = a) = P(a ≤ X ≤ a) = 0.

Example 5.1

The random variable X of the life lengths of batteries discussed earlier (measured in hundreds of hours) is associated with a probability density function of the form
f(x) = (1/2)e^{-x/2} for x > 0, and f(x) = 0 otherwise.
Find the probability that the life of a particular battery of this type is less than 200 or greater than 400 hours.

Answer: Let A be the event that X is less than 2 (hundreds of hours) and B the event that X is greater than 4. Since A and B are mutually exclusive,
P(A ∪ B) = P(A) + P(B) = P(X ≤ 2) + P(X ≥ 4) = (1 - e^{-1}) + e^{-2} ≈ 0.767.
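A quick numeric check of this answer, under the assumption (consistent with the printed probabilities) that the survival function is P(X > t) = e^{-t/2} with t in hundreds of hours:

```python
import math

# P(A ∪ B) = P(X <= 2) + P(X >= 4) for the battery-life distribution.
p_A = 1 - math.exp(-1.0)    # P(X <= 2) = 1 - e^{-1}
p_B = math.exp(-2.0)        # P(X >= 4) = e^{-2}
print(round(p_A + p_B, 3))  # 0.767
```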

Example 5.2

Refer to Example 5.1. Find the probability that a battery of this type lasts more than 3 (hundreds of hours), given that it already has been in use for more than 2 (hundreds of hours).

Answer: Let A be the event that X has been in use for more than 2, i.e., A = {X > 2}, and let B denote the event that X lasts more than 3, i.e., B = {X > 3}. We are interested in the conditional probability
P(X > 3 | X > 2) = P(X > 3, X > 2)/P(X > 2) = P(X > 3)/P(X > 2) = e^{-3/2}/e^{-1} ≈ 0.606.
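The same survival function gives a one-line check of this conditional probability; note the result equals P(X > 1) = e^{-1/2}, the memoryless property of the exponential distribution:

```python
import math

# P(X > 3 | X > 2) = P(X > 3) / P(X > 2) with P(X > t) = e^(-t/2).
p = math.exp(-3 / 2) / math.exp(-1.0)
print(round(p, 4))                      # 0.6065
assert math.isclose(p, math.exp(-0.5))  # memorylessness: equals P(X > 1)
```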

Part 3. Distribution Function

Definition 5.2

The distribution function for a random variable X is defined as F(b) = P(X ≤ b). If X is continuous with probability density function f(x), then
F(b) = ∫_{-∞}^{b} f(x) dx.
Notice that F'(x) = f(x). For example, for the battery-life density f(x) = (1/2)e^{-x/2}, x > 0, integration gives
F(x) = 1 - e^{-x/2} for x > 0, and F(x) = 0 for x ≤ 0.

Properties

The cumulative distribution function F(x) of a continuous random variable X has the following properties:
1. 0 ≤ F(x) ≤ 1 for all x.
2. F is nondecreasing: if x1 ≤ x2, then F(x1) ≤ F(x2).
3. lim_{x→-∞} F(x) = 0 and lim_{x→∞} F(x) = 1.
4. F is continuous, and F'(x) = f(x) wherever the derivative exists.

Example 5.3

The distribution function of the random variable X, the time (in months) from diagnosis until death for one population of patients with AIDS, is as follows:
F(x) = 1 - e^{-0.03x^{1.2}}, x > 0.
1. Find the probability that a randomly selected person from this population survives at least 12 months.
2. Find the probability density function of X.

Answer:
1. P(X ≥ 12) = 1 - P(X ≤ 12) = 1 - F(12) = 1 - (1 - e^{-0.03(12)^{1.2}}) = e^{-0.03(12)^{1.2}} ≈ 0.553.
2. f(x) = F'(x) = 0.036x^{0.2}e^{-0.03x^{1.2}}, x > 0; and 0, otherwise.
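Both parts can be verified numerically: the survival probability directly, and the density by comparing F' against a finite-difference derivative:

```python
import math

# Survival probability at 12 months from F(x) = 1 - exp(-0.03 x^1.2).
s12 = math.exp(-0.03 * 12 ** 1.2)
print(round(s12, 3))  # 0.553

# Density f(x) = F'(x) = 0.036 x^0.2 exp(-0.03 x^1.2): check at x = 12.
def F(x):
    return 1 - math.exp(-0.03 * x ** 1.2)

def f(x):
    return 0.036 * x ** 0.2 * math.exp(-0.03 * x ** 1.2)

h = 1e-6
assert abs((F(12 + h) - F(12 - h)) / (2 * h) - f(12)) < 1e-6
```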

Part 4. More Examples

Additional Example 1

Determine the value of c so that the following function is a density function:
f(x) = c/(x + 1)^3 for x > 0, and 0 otherwise.

Answer: A density function must satisfy ∫ f(x) dx = 1, so
∫_0^∞ c/(x + 1)^3 dx = c · (1/2) = 1,
since ∫_0^∞ (x + 1)^{-3} dx = 1/2. Hence c = 2.
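A sketch of the same calculation, assuming the support is x > 0 (an assumption consistent with the stated answer c = 2): the antiderivative of (x + 1)^{-3} is -(1/2)(x + 1)^{-2}, and a crude midpoint-rule sum confirms the density integrates to 1:

```python
# With c = 2 the density 2/(x + 1)^3 on x > 0 integrates to 1,
# since the integral of (x + 1)^(-3) over (0, ∞) equals 1/2.
c = 2.0

# Midpoint rule on [0, 200]; the tail beyond 200 is negligible (< 3e-5).
dx = 1e-3
total = sum(c / (k * dx + dx / 2 + 1) ** 3 * dx for k in range(int(200 / dx)))
print(round(total, 3))  # 1.0
```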

Additional Example 2

Let X have the density function f(x) = 3x^2/θ^3 for 0 ≤ x ≤ θ, and 0 otherwise. If P(X > 1) = 7/8, find the value of θ.

Answer: First find F(b) = (b/θ)^3 for 0 ≤ b ≤ θ. Then
P(X > 1) = 1 - P(X ≤ 1) = 1 - F(1) = 1 - (1/θ)^3 = 7/8,
so (1/θ)^3 = 1/8, which gives θ = 2.
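The algebra can be checked by inverting 1 - (1/θ)^3 = 7/8 numerically:

```python
# Invert P(X > 1) = 1 - (1/theta)**3 = 7/8, i.e. (1/theta)**3 = 1/8.
theta = (1 / (1 - 7 / 8)) ** (1 / 3)  # cube root of 8
print(round(theta, 6))  # 2.0
```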