Markov Chains Applications

Brand Switching

100 customers currently using brand A: 84 will stay with A, 9 will switch to B, 7 will switch to C.
100 customers currently using brand B: 78 will stay with B, 14 will switch to A, 8 will switch to C.
100 customers currently using brand C: 90 will stay with C, 4 will switch to A, 6 will switch to B.

Brand Switching

States: brands of the product (A, B, C)
Time period: the interval between purchases
Transition probabilities:

          A    B    C
P =  A  .84  .09  .07
     B  .14  .78  .08
     C  .04  .06  .90

Brand Switching

If the current market share is given by pi(0) = (.39, .32, .29):
What is the market share after one time period?
What is the long-term market share?
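Both questions can be answered numerically. A minimal sketch in plain Python, using the matrix and initial shares from the slides; power iteration is one standard way to approximate the long-run distribution of a regular chain:

```python
# One-step and long-run market shares for the brand-switching chain.
# Rows/columns are ordered A, B, C; entries come from the slide
# (84/9/7, 14/78/8, 4/6/90 customers out of 100).

P = [
    [0.84, 0.09, 0.07],  # from brand A
    [0.14, 0.78, 0.08],  # from brand B
    [0.04, 0.06, 0.90],  # from brand C
]
pi0 = [0.39, 0.32, 0.29]  # current market share

def step(pi, P):
    """One time period of the chain: pi' = pi P."""
    n = len(pi)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# Market share after one period.
pi1 = step(pi0, P)

# Long-term share: iterate until the distribution settles
# (for this regular chain the result is independent of pi0).
pi = pi0
for _ in range(500):
    pi = step(pi, P)

print([round(x, 3) for x in pi1])  # one-period shares
print([round(x, 3) for x in pi])   # long-run shares
```

After one period the shares are roughly (.384, .302, .314); in the long run they approach roughly (.325, .249, .426), so brand C eventually dominates despite A's current lead.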

Brand Switching Assumptions

Markovian property: brand switching might actually depend on more than the most recent brand; e.g., dissatisfaction with brand A may have driven a customer to B, while a marketing campaign may have drawn another to C.
Stationarity property: marketing strategies may change the transition probabilities over time.

Stock Market Analysis

Transition probabilities (the slide's state-transition diagram and matrix P are not legible in this transcript).

Stock Market Analysis Assumptions

Markovian property
Stationarity property

Equipment Replacement

States:
State 1: new filter
State 2: one year old, no repairs
State 3: two years old, no repairs
State 4: repaired once
Time period: one year
Transition probabilities: a repaired filter is replaced after one year; filters are scrapped after three years of use.

Equipment Replacement

Transition probabilities (states 1-4 as above; entries reconstructed from the slide):

           1    2    3    4
P =  1     0   .7    0   .3
     2     0    0   .4   .6
     3    .5    0    0   .5
     4     1    0    0    0

Equipment Replacement

Steady-state probabilities satisfy pi P = pi together with pi1 + pi2 + pi3 + pi4 = 1:

pi1 = .5 pi3 + pi4
pi2 = .7 pi1
pi3 = .4 pi2
pi4 = .3 pi1 + .6 pi2 + .5 pi3

Solving gives pi = (.352, .246, .098, .304).
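The steady state can be checked numerically. A sketch assuming the transition matrix implied by the slide's numbers (states ordered new / one year old / two years old / repaired once; the entries are a reconstruction consistent with the reported steady state, not a verbatim copy of the slide):

```python
# Steady state of the filter-replacement chain by power iteration.
# States: 1 = new, 2 = one year old, 3 = two years old, 4 = repaired once.
# Matrix entries are an assumed reconstruction of the garbled slide.

P = [
    [0.0, 0.7, 0.0, 0.3],  # new: survives a year or is repaired
    [0.0, 0.0, 0.4, 0.6],  # one year old: survives or is repaired
    [0.5, 0.0, 0.0, 0.5],  # two years old: scrapped (replaced new) or repaired
    [1.0, 0.0, 0.0, 0.0],  # repaired filter is replaced after one year
]

pi = [1.0, 0.0, 0.0, 0.0]  # start all filters new; the limit is the same anyway
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

print([round(x, 3) for x in pi])
```

This yields approximately (.352, .246, .099, .303), matching the slide's reported (.352, .246, .098, .304) to within rounding.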

Equipment Replacement

Suppose that the following costs apply:
New filter: $500
Repair filter: $150
Scrap filter: ($50) (scrap salvage)
For a pool of 100 filters, what is the expected annual cost of our repair policy?
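One plausible accounting, using the steady-state probabilities (.352, .246, .098, .304) from the earlier slide: assume the new-filter cost is paid each time a filter enters the "new" state, the repair cost each time one enters the "repaired" state, and the salvage is recovered for every filter scrapped (i.e., replaced). These assumptions are an interpretation; the slide does not spell out the accounting.

```python
# Expected annual cost of the repair policy for a pool of 100 filters,
# using the steady-state probabilities reported on the slide.

pi = {"new": 0.352, "one_yr": 0.246, "two_yr": 0.098, "repaired": 0.304}

NEW_COST = 500      # buying a new filter
REPAIR_COST = 150   # repairing a filter
SALVAGE = 50        # recovered when a filter is scrapped

# In steady state, the per-period flow into a state equals its probability,
# so per filter-year: a fraction pi["new"] of filters is bought new (and the
# same fraction of old filters is scrapped), and pi["repaired"] is repaired.
cost_per_filter = (NEW_COST * pi["new"]
                   + REPAIR_COST * pi["repaired"]
                   - SALVAGE * pi["new"])

pool_cost = 100 * cost_per_filter
print(round(pool_cost, 2))  # 20400.0
```

Under these assumptions the policy costs about $204 per filter-year, or roughly $20,400 per year for the pool of 100 filters.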

Population Mobility

A forest consists of 4 major species: Aspen (A), Birch (B), Oak (O), Maple (M). The slide gives a 4x4 transition matrix P over (A, B, O, M); only the Aspen row (.05, .08, .03, .84) is legible in this transcript.

Population Mobility

States:
State 1: no movement
State 2: movement within a region
State 3: movement out of a region
State 4: movement into a region
Time period
Transition probabilities

Population Mobility: Movement at SDSM&T

States:
State 1: student remains in major
State 2: student switches major
State 3: student leaves school (transfers out, matriculates)
State 4: student enters school (transfers in, first-year students)
Time period: semester
Transition probabilities

Population Mobility: Movement at SDSM&T

States:
State 1: student in IE
State 2: student in other major
State 3: student leaves school (transfers out, matriculates)
State 4: student enters school (transfers in, first-year students)
Time period: semester
Transition probabilities: only the first row of P (from state 1: .85, .05, .10) is legible in this transcript.