Markov Chains Applications

Brand Switching
- Of 100 customers currently using brand A: 84 will stay with A, 9 will switch to B, 7 will switch to C
- Of 100 customers currently using brand B: 78 will stay with B, 14 will switch to A, 8 will switch to C
- Of 100 customers currently using brand C: 90 will stay with C, 4 will switch to A, 6 will switch to B
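The counts above can be turned into one-step transition probabilities by dividing each row by its 100-customer total. A minimal sketch (variable names are mine):

```python
# Observed one-period brand switches, out of 100 customers per current brand.
counts = {
    "A": {"A": 84, "B": 9, "C": 7},
    "B": {"A": 14, "B": 78, "C": 8},
    "C": {"A": 4, "B": 6, "C": 90},
}

# Estimate transition probabilities: divide each row by its row total.
P = {i: {j: n / sum(row.values()) for j, n in row.items()}
     for i, row in counts.items()}

print(P["A"])  # {'A': 0.84, 'B': 0.09, 'C': 0.07}
```

Each row of the estimated matrix sums to 1, as a transition matrix must.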


Brand Switching
- States: the brands of the product (A, B, C)
- Time period: the interval between successive purchases
- Transition probabilities (from the switching data above):

           A     B     C
     A [ 0.84  0.09  0.07 ]
P =  B [ 0.14  0.78  0.08 ]
     C [ 0.04  0.06  0.90 ]

Brand Switching
If the current market share is given by the row vector π(0) = (π_A(0), π_B(0), π_C(0)):
a. What is the market share after one time period?
b. What is the long-term market share?

Brand Switching
Market share after one time period: multiply the current share vector by the transition matrix,

π(1) = π(0) P,   i.e.   π_j(1) = π_A(0) p_Aj + π_B(0) p_Bj + π_C(0) p_Cj   for j = A, B, C.
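As a sketch, the one-step update π(1) = π(0)P can be computed directly. The initial shares below are hypothetical placeholders (the slide's numeric vector did not survive):

```python
# Transition matrix from the brand-switching data (rows and columns: A, B, C).
P = [
    [0.84, 0.09, 0.07],
    [0.14, 0.78, 0.08],
    [0.04, 0.06, 0.90],
]

# Hypothetical current market shares for A, B, C (placeholder values).
pi0 = [0.30, 0.30, 0.40]

# One-step update: pi1[j] = sum_i pi0[i] * P[i][j]
pi1 = [sum(pi0[i] * P[i][j] for i in range(3)) for j in range(3)]
print(pi1)  # shares after one purchase interval
```

Note that the updated shares still sum to 1, since each row of P sums to 1.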

Brand Switching
Long-term market share: iterate the update n times,

π(n) = π(0) P^n,

which converges as n → ∞ to a steady-state vector that no longer depends on the starting shares.

Brand Switching
The steady-state shares solve π = πP together with the normalization condition:

π_A = 0.84 π_A + 0.14 π_B + 0.04 π_C
π_B = 0.09 π_A + 0.78 π_B + 0.06 π_C
π_C = 0.07 π_A + 0.08 π_B + 0.90 π_C
π_A + π_B + π_C = 1
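A quick way to find the steady state is repeated multiplication by P until the distribution stops changing (a sketch; the starting vector is arbitrary because this chain is regular):

```python
# Brand-switching transition matrix (rows and columns: A, B, C).
P = [
    [0.84, 0.09, 0.07],
    [0.14, 0.78, 0.08],
    [0.04, 0.06, 0.90],
]

def step(pi):
    """One step of the chain: pi' = pi P (row vector times matrix)."""
    return [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

pi = [1.0, 0.0, 0.0]      # arbitrary starting distribution
for _ in range(500):      # iterate until (numerically) stationary
    pi = step(pi)

print([round(x, 4) for x in pi])  # long-run shares for A, B, C
```

The fixed point satisfies the steady-state equations above, so applying one more step leaves it unchanged.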

Brand Switching
Assumptions:
- Markovian property: real brand switching may depend on more than the most recent brand; e.g., dissatisfaction with brand A led a customer to B, or a marketing lead-in led one to C.
- Stationarity property: marketing strategies may change the transition probabilities over time.

Stock Market Analysis
- States: price of the stock: $0, $5, $10, $15, $20
- Time increment: opening price on successive trading days
- Transition probabilities: from each price, the probability of going up $5 is a_i = 1/4, of going down $5 is b_i = 1/4, and of staying the same is c_i = 1/2

Stock Market Analysis
Transition probabilities (rows and columns ordered $0, $5, $10, $15, $20; the boundary rows below assume a blocked move leaves the price unchanged, since only the interior up/stay/down pattern is stated):

           $0    $5   $10   $15   $20
     $0 [ 3/4   1/4    0     0     0  ]
     $5 [ 1/4   1/2   1/4    0     0  ]
P = $10 [  0    1/4   1/2   1/4    0  ]
    $15 [  0     0    1/4   1/2   1/4 ]
    $20 [  0     0     0    1/4   3/4 ]
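A sketch of simulating this chain. The boundary handling (a blocked move keeps the price where it is) is my assumption; the slide only states the interior up/stay/down probabilities:

```python
import random

# Prices are multiples of $5 between $0 and $20.
# Interior states: up $5 w.p. 1/4, down $5 w.p. 1/4, stay w.p. 1/2.
# Assumption: at $0 and $20, a blocked move leaves the price unchanged.
def next_price(price):
    r = random.random()
    if r < 0.25:
        move = +5
    elif r < 0.50:
        move = -5
    else:
        move = 0
    return min(20, max(0, price + move))

random.seed(1)                 # reproducible sample path
path = [10]                    # start at $10
for _ in range(10):
    path.append(next_price(path[-1]))
print(path)  # a sample 10-day price path
```

Every simulated price stays on the $0–$20 grid, and successive prices differ by at most $5, matching the transition structure.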

Stock Market Analysis
Assumptions:
- Markovian property
- Stationarity property

Equipment Replacement
- States
  State 1: new filter
  State 2: one year old, no repairs
  State 3: two years old, no repairs
  State 4: repaired once
- Time period: one year
- Transition probabilities: a repaired filter is replaced after one more year; filters are scrapped after three years of use

Equipment Replacement
Transition probabilities [numeric entries illegible in the source; writing p1 for the probability a filter needs repair in its first year and p2 for the probability a one-year-old filter needs repair, the state definitions give the structure]:

           1      2      3     4
     1 [   0    1-p1    0    p1  ]
P =  2 [   0     0    1-p2   p2  ]
     3 [   1     0      0     0  ]   (scrapped after 3 years, replaced by new)
     4 [   1     0      0     0  ]   (repaired filter replaced after one year)

Equipment Replacement
Steady-state probabilities: solve π = πP together with normalization; with p1 and p2 the first- and second-year repair probabilities, this gives

π1 = π3 + π4
π2 = (1 - p1) π1
π3 = (1 - p2) π2
π4 = p1 π1 + p2 π2
π1 + π2 + π3 + π4 = 1


Equipment Replacement
Suppose that the following costs apply:
- New filter: $500
- Repair filter: $150
- Scrap filter: ($50) (scrap salvage credit)
For a pool of 100 filters, the expected annual cost of the repair policy per filter is

E[Cost] = Σ_j C_j π_j = 500 π_buy + 150 π_repair - 50 π_salvage ≈ $201 per filter,

or about $20,100 for the pool of 100.
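Putting the pieces together with hypothetical repair probabilities (placeholders, since the slide's numbers were garbled: p1 = 0.2 and p2 = 0.3 below are not from the source), the per-filter cost follows from the steady-state distribution:

```python
# Hypothetical repair probabilities (placeholders, not from the slide).
p1, p2 = 0.2, 0.3   # P(repair in 1st year), P(repair in 2nd year)

# Transition matrix over states: 1 new, 2 one yr old, 3 two yrs old, 4 repaired.
P = [
    [0.0, 1 - p1, 0.0,    p1 ],
    [0.0, 0.0,    1 - p2, p2 ],
    [1.0, 0.0,    0.0,    0.0],   # scrapped after 3 years -> replaced by new
    [1.0, 0.0,    0.0,    0.0],   # repaired filter replaced after one year
]

# Steady-state distribution by power iteration.
pi = [1.0, 0.0, 0.0, 0.0]
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

# Per-filter annual cost: buy a new filter on entering state 1, pay a repair
# on entering state 4, receive salvage when a 3-year-old filter is scrapped.
cost_per_filter = 500 * pi[0] + 150 * pi[3] - 50 * pi[2]
print(round(cost_per_filter, 2), "per filter;",
      round(100 * cost_per_filter, 2), "for the pool of 100")
```

Changing p1 and p2 to the slide's actual values would reproduce the ≈$201-per-filter figure.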

Population Mobility (forest succession)
A forest consists of four major species:
- Aspen (A)
- Birch (B)
- Oak (O)
- Maple (M)
Succession is modeled by a 4×4 transition matrix P with rows and columns ordered A, B, O, M [numeric entries illegible in the source].

Population Mobility
- States
  State 1: no movement
  State 2: movement within a region
  State 3: movement out of a region
  State 4: movement into a region
- Time period
- Transition probabilities

Population Mobility
- States: movement at SDSM&T
  State 1: student remains in major
  State 2: student switches major
  State 3: student leaves school (transfers out, matriculates)
  State 4: student enters school (transfers in, first-year students)
- Time period: semester
- Transition probabilities

Population Mobility
- States: movement at SDSM&T
  State 1: student in IE
  State 2: student in another major
  State 3: student leaves school (transfers out, matriculates)
  State 4: student enters school (transfers in, first-year students)
- Time period: semester
- Transition probabilities: a 4×4 matrix P over these states [numeric entries illegible in the source]