Section 10.2 Regular Markov Chains


Chapter 10 Markov Chains, Section 10.2 Regular Markov Chains

Short-Term Predictions Recall from the previous section that short-term predictions for n repetitions of an experiment are made by multiplying the initial probability vector by the nth power of the transition matrix.

KickKola Example Also from the previous section, we found that KickKola soda’s market share increased with each of the next two purchases, starting from its original market share of 14%. (See Section 10.1, Example 2 notes.) What will happen to its market share after a large number of repetitions? Answering this question requires a long-range prediction: finding the level at which the trend stabilizes.
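For concreteness, here is a minimal sketch of that short-term computation in Python with NumPy. The transition probabilities (63% repeat purchase, 12% switch-in) are taken from Example 3 later in this section, and the state order [KickKola, other] is an assumption:

```python
import numpy as np

# Transition matrix, rows/columns ordered [KickKola, other cola].
# Probabilities assumed from Example 3 below: 63% of KickKola drinkers
# buy it again; 12% of non-drinkers switch to it.
P = np.array([[0.63, 0.37],
              [0.12, 0.88]])

v = np.array([0.14, 0.86])  # initial market share: 14% KickKola

for n in range(1, 3):
    v = v @ P               # v_n = v_{n-1} P
    print(f"after purchase {n}: KickKola share = {v[0]:.4f}")
# after purchase 1: KickKola share = 0.1914
# after purchase 2: KickKola share = 0.2176
```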

Long-Range Predictions In this section, we try to decide what will happen to the initial probability vector in the long run – that is, as n gets larger and larger. Assumption: transition matrix probabilities remain constant from repetition to repetition.
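As a numerical illustration (a minimal sketch assuming NumPy, reusing the KickKola transition matrix from Example 3 below), repeatedly applying the transition matrix drives very different initial probability vectors toward the same limit:

```python
import numpy as np

P = np.array([[0.63, 0.37],
              [0.12, 0.88]])

# Two very different initial probability vectors...
for v in (np.array([0.14, 0.86]), np.array([1.0, 0.0])):
    for _ in range(50):
        v = v @ P            # one more repetition of the experiment
    print(v)                 # both converge to about [0.2449, 0.7551]
```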

Regular Transition Matrices One of the many applications of Markov chains is making long-range predictions, but long-range predictions are not possible with every transition matrix. They are always possible with regular transition matrices.

Regular Transition Matrices A transition matrix is regular if some power of the matrix contains all positive entries. A Markov chain is a regular Markov chain if its transition matrix is regular.

Determining if Transition Matrices are Regular If some power of the matrix has all positive, non-zero entries, then the matrix is regular. If a transition matrix has some zero entries and these zeros occur in identical places in both P^k and P^(k+1) for some k, then the zeros will appear in those places for all higher powers of P. Therefore, P is not regular.
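A direct way to carry out this test is to compute successive powers of P and look for one with all positive entries. A minimal NumPy sketch (the cap of 50 powers is an arbitrary illustration choice, not part of the theory):

```python
import numpy as np

def is_regular(P, max_power=50):
    """Return True if some power P^k (k <= max_power) has all
    strictly positive entries, i.e. the matrix is regular."""
    Q = np.array(P, dtype=float)
    for _ in range(max_power):
        if np.all(Q > 0):
            return True
        Q = Q @ P
    return False

# A matrix with a zero entry can still be regular:
A = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(A))   # True: A^2 already has all positive entries

# The identity matrix is not regular: every power keeps its zeros.
I = np.eye(2)
print(is_regular(I))   # False
```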

Example 1 Decide whether the following transition matrices are regular. a.) b.)

Equilibrium Vectors

Equilibrium Vector If a Markov chain with transition matrix P is regular, then there exists a probability vector V such that VP = V. This vector V gives the long-range trend of the Markov chain. V is found by solving a system of linear equations. If there are two states, then V = [ x y ]; if there are three states, then V = [ x y z ].
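Since VP = V together with the condition that the entries of V sum to 1 is just a linear system, it can be solved mechanically. A minimal NumPy sketch (the function name equilibrium_vector is my own):

```python
import numpy as np

def equilibrium_vector(P):
    """Solve V P = V with sum(V) = 1 for a regular transition matrix P.
    Transposing V P = V gives (P^T - I) V^T = 0; one of those equations
    is redundant, so we replace it with the condition sum(V) = 1."""
    n = P.shape[0]
    A = P.T - np.eye(n)        # (P^T - I) V^T = 0
    A[-1, :] = 1.0             # replace redundant row with sum(V) = 1
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0.63, 0.37],
              [0.12, 0.88]])
print(equilibrium_vector(P))   # approximately [0.2449, 0.7551]
```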

Example 2 Find the equilibrium vector for the transition matrices below. a.) b.)

Properties of a Regular Markov Chain
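The transcript does not reproduce this slide's content (it was an image). One standard property of regular Markov chains, which this slide presumably lists, is that the powers P^n approach a matrix whose rows all equal the equilibrium vector V, so the long-range trend does not depend on the initial probability vector. A quick numerical check with the KickKola matrix:

```python
import numpy as np

P = np.array([[0.63, 0.37],
              [0.12, 0.88]])

# For a regular chain, P^n converges to a matrix whose rows
# are all the equilibrium vector V.
print(np.linalg.matrix_power(P, 50))
# [[0.2449 0.7551]
#  [0.2449 0.7551]] (approximately)
```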

Steps for Making a Long-Range Prediction
1.) Create the transition matrix P.
2.) Determine whether the chain is regular. (This occurs when the transition matrix is regular.)
3.) Create the equilibrium vector V.
4.) Find and simplify the system of equations described by VP = V.
5.) Discard any redundant equation and include the equation x + y = 1 for V = [x y] (or x + y + z = 1 for three states).
6.) Solve the resulting system.
7.) Check your work by verifying that VP = V.

Example 3 - KickKola Revisited A marketing analysis shows that 12% of the consumers who do not currently drink KickKola will purchase KickKola the next time they buy a cola and that 63% of the consumers who currently drink KickKola will purchase it the next time they buy a cola. Make a long-range prediction of KickKola’s ultimate market share, assuming that current trends continue.
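A sketch of the solution, following the steps above and assuming the states are ordered [KickKola, other]: the transition matrix is

P = [ 0.63  0.37 ]
    [ 0.12  0.88 ]

and VP = V with V = [x y] gives 0.63x + 0.12y = x, that is, 0.37x = 0.12y. Substituting y = 1 - x from x + y = 1 yields 0.49x = 0.12, so x = 12/49 ≈ 0.245. If current trends continue, KickKola's market share should stabilize at about 24.5%.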

Example 4 A census report shows that currently 32% of the residents of Metropolis own their home and 68% rent. Of those who rent, 12% plan to buy a home in the next 12 months, while 3% of the homeowners plan to sell their home and rent instead. Make a long-range prediction of the percent of Metropolis residents who will own their home and the percent who will rent, assuming current trends hold.
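A minimal numerical sketch of the solution (NumPy assumed; the state order [own, rent] is an assumption). Note that the current 32%/68% split does not affect the equilibrium, only the transition probabilities do:

```python
import numpy as np

# States ordered [own, rent]: 3% of owners switch to renting,
# 12% of renters buy a home in the next 12 months.
P = np.array([[0.97, 0.03],
              [0.12, 0.88]])

A = P.T - np.eye(2)     # V P = V  is equivalent to  (P^T - I) V^T = 0
A[-1, :] = 1.0          # replace the redundant row with x + y = 1
V = np.linalg.solve(A, [0.0, 1.0])
print(V)                # [0.8, 0.2]: about 80% will own, 20% will rent
```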