10.3 Absorbing Markov Chains


10.3 Absorbing Markov Chains In this section, we discuss absorbing Markov chains and illustrate them through examples. We will see that the powers of the transition matrix for an absorbing Markov chain approach a limiting matrix. First, we must introduce some terminology.

Absorbing States and Absorbing Chains A state in a Markov chain is called an absorbing state if, once the state is entered, it is impossible to leave. We illustrate this concept with an example. For the following transition matrix, B is an absorbing state, since the probability of going from state B to state B is one. This means that once the system enters state B it never leaves, because the probability of moving from state B to state A or state C is zero, as indicated by the second row.
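The slide's matrix is not reproduced in this transcript. The following is an illustrative transition matrix consistent with the description; the entries in rows A and C are hypothetical, but row B must be exactly as shown for B to be absorbing:

P = \begin{array}{c|ccc}
  & A & B & C \\ \hline
A & 0.5 & 0.3 & 0.2 \\
B & 0 & 1 & 0 \\
C & 0.2 & 0.4 & 0.4
\end{array}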

Theorem 1: Absorbing States and Transition Matrices A state in a Markov chain is absorbing if and only if the row of the transition matrix corresponding to that state has a 1 on the main diagonal and zeros elsewhere. To ensure that the transition matrix for a Markov chain with one or more absorbing states has a limiting matrix, the chain must satisfy the following definition. A Markov chain is an absorbing chain if: (A) there is at least one absorbing state; and (B) it is possible to go from each non-absorbing state to at least one absorbing state in a finite number of steps. A programmatic check of both conditions is sketched below.
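The section contains no code; as a minimal sketch (assuming NumPy and a row-stochastic transition matrix), condition (A) can be checked by inspecting the main diagonal and condition (B) by a reachability search from each non-absorbing state. The example reuses the illustrative matrix above.

```python
import numpy as np

def absorbing_states(P):
    """Indices whose row has a 1 on the main diagonal (for a
    row-stochastic matrix this forces zeros elsewhere, per Theorem 1)."""
    P = np.asarray(P, dtype=float)
    return [i for i in range(P.shape[0]) if P[i, i] == 1.0]

def is_absorbing_chain(P):
    """Check both conditions of the definition:
    (A) at least one absorbing state exists, and
    (B) every non-absorbing state can reach some absorbing state
        in a finite number of steps."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    absorbing = set(absorbing_states(P))
    if not absorbing:                      # condition (A) fails
        return False
    for start in set(range(n)) - absorbing:
        # Graph search along transitions with positive probability.
        seen, frontier = {start}, [start]
        while frontier:
            i = frontier.pop()
            for j in range(n):
                if P[i, j] > 0 and j not in seen:
                    seen.add(j)
                    frontier.append(j)
        if not (seen & absorbing):         # condition (B) fails for this state
            return False
    return True

# Example: B (row 1) is absorbing, and both A and C can reach it.
P = [[0.5, 0.3, 0.2],
     [0.0, 1.0, 0.0],
     [0.2, 0.4, 0.4]]
print(is_absorbing_chain(P))  # True
```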

Recognizing Absorbing Markov Chains We use a transition diagram to determine whether P is the transition matrix for an absorbing Markov chain.

From the transition diagram, we see that B is an absorbing state.

In a transition matrix, if all the absorbing states precede all the non-absorbing states, the matrix is said to be in standard form. Standard forms are very useful in determining limiting matrices for absorbing Markov chains. A matrix whose absorbing states A and B precede its non-absorbing state C is in standard form. In general, the standard-form matrix P is partitioned into four sub-matrices I, O, R, and Q, where I is an identity matrix and O is a zero matrix, as shown below.
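Neither matrix survives in this transcript. The general partitioned form is the standard one (it appears again in Theorem 2 below); in the concrete instance, the entries of the non-absorbing row C are hypothetical:

P = \begin{bmatrix} I & O \\ R & Q \end{bmatrix}
\qquad\text{e.g.}\qquad
P = \begin{array}{c|ccc}
  & A & B & C \\ \hline
A & 1 & 0 & 0 \\
B & 0 & 1 & 0 \\
C & 0.3 & 0.2 & 0.5
\end{array}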

Limiting Matrix We are now ready to discuss the long-run behavior of absorbing Markov chains. This will be illustrated with an application. Two competing real estate companies are trying to buy all the farms in a particular area for future housing development. Each year, 10% of the farmers sell to company A, 40% sell to company B, and the remainder continue farming. Neither company ever sells any of the farms it purchases. (A) Draw a transition diagram for this Markov process and determine whether the associated Markov chain is absorbing. (B) Write a transition matrix in standard form. (C) If neither company owns any farms at the beginning of this competitive buying process, estimate the percentage of farms that each company will purchase in the long run.

Transition Diagram We see that states A and B are absorbing. We also see that the associated Markov chain is absorbing, since there are two absorbing states, A and B, and it is possible to go from the non-absorbing state C to either A or B in one step.

We use the transition diagram to write a transition matrix that is in standard form, shown below.
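The matrix itself is missing from the transcript, but it is fully determined by the problem data (10% sell to A, 40% sell to B, 50% keep farming):

P = \begin{array}{c|ccc}
  & A & B & C \\ \hline
A & 1 & 0 & 0 \\
B & 0 & 1 & 0 \\
C & 0.1 & 0.4 & 0.5
\end{array}
\qquad\text{so}\qquad
R = \begin{bmatrix} 0.1 & 0.4 \end{bmatrix},\quad
Q = \begin{bmatrix} 0.5 \end{bmatrix}.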

Initial State Matrix; Successive State Matrices At the beginning of this process, all of the farmers are in state C, since no land has been sold initially. Therefore, with the states ordered A, B, C, the initial state matrix is S0 = [0 0 1]. The successive state matrices are obtained by repeated multiplication by P. Do the state matrices have a limit? To find out, we compute the 10th state matrix and examine the result, as sketched below. We conjecture that the limit of the state matrices is [0.2 0.8 0].
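A minimal sketch of this computation (assuming NumPy; states ordered A, B, C as above):

```python
import numpy as np

# Standard-form transition matrix for the farm problem (states A, B, C).
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.1, 0.4, 0.5]])

s = np.array([0.0, 0.0, 1.0])  # initial state: every farm still in C

for k in range(10):
    s = s @ P                  # S_k = S_{k-1} P
print(s)  # approx. [0.1998 0.7992 0.0010] -> conjectured limit [0.2 0.8 0]
```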

Theorem 2: Limiting Matrices for Absorbing Markov Chains If a standard form P for an absorbing Markov chain is partitioned as

P = \begin{bmatrix} I & O \\ R & Q \end{bmatrix}

then P^k approaches a limiting matrix \bar{P} as k increases, where

\bar{P} = \begin{bmatrix} I & O \\ FR & O \end{bmatrix}.

The matrix F = (I - Q)^{-1} is called the fundamental matrix for P. The identity matrix used to form the fundamental matrix F must be the same size as the matrix Q.
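A minimal sketch of Theorem 2 applied to the farm problem (assuming NumPy; R and Q as identified above):

```python
import numpy as np

R = np.array([[0.1, 0.4]])   # non-absorbing row C -> absorbing states A, B
Q = np.array([[0.5]])        # non-absorbing row C -> non-absorbing state C

# Fundamental matrix F = (I - Q)^{-1}; I must be the same size as Q.
F = np.linalg.inv(np.eye(Q.shape[0]) - Q)
print(F)      # [[2.]]
print(F @ R)  # [[0.2 0.8]] -- long-run probabilities of absorption in A and B
```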

Using the Formulas to Find the Limiting Matrix We identify and display the appropriate matrices: R = [0.1 0.4] and Q = [0.5]. We use the formula to find the matrix F; since Q is a 1x1 matrix, F = (I - Q)^{-1} = (1 - 0.5)^{-1} = [2]. Next, we determine FR = [2][0.1 0.4] = [0.2 0.8]. Finally, we display the limiting matrix, shown below. Notice that its last row is the same as the limit [0.2 0.8 0] conjectured from the kth state matrices.
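Assembling the pieces according to Theorem 2, the limiting matrix is:

\bar{P} = \begin{array}{c|ccc}
  & A & B & C \\ \hline
A & 1 & 0 & 0 \\
B & 0 & 1 & 0 \\
C & 0.2 & 0.8 & 0
\end{array}

In the long run, company A purchases 20% of the farms and company B purchases 80%.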