
Saturday Agenda
– Interfaces – Johnson, Robert R.
– How Bayer Makes Decisions to Develop New Drugs – Isbrandt, Derek
– A Major League Baseball Team Uses Operations Research to Improve Draft Preparation – Agnew, Joseph G.
– Kaizen and Stochastic Networks Support the Investigation of Aircraft Failures – Tran, Binh
– Patient-Centered Care: A Simulation Model to Compare Strategies for the Reduction of Health-Care-Associated …
– Markov Chains
– Networks
– Big Data and Analytics (Guest lecturers)
STAT 5802 – Markov Chains

Markov Chains Andrey Markov

Markov Chains – General Description
We want to describe the behavior of a system as it moves (makes transitions) probabilistically from "state" to "state".
– States may be qualitative or quantitative
– Basic Assumption – The future depends only on the present (current state) and not on the past. That is, the future depends on the state we are in, not on how we arrived at this state.
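In symbols, this basic assumption (the Markov property) says that for a discrete-time chain

    P(X(n+1) = j | X(n) = i, X(n-1), …, X(0)) = P(X(n+1) = j | X(n) = i),

so the one-step transition probabilities out of the current state i are all that is needed.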

Example 1 - Brand loyalty or Market Share For ease, assume that all cola buyers purchase either Coke or Pepsi in any given week. That is, there is a duopoly. Assume that if a customer purchases Coke in one week there is a 90% chance that the customer will purchase Coke the next week (and a 10% chance that the customer will purchase Pepsi). Similarly, 80% of Pepsi drinkers will repeat the purchase from week to week.

Example 1 - Developing the Markov Matrix
States
– State 1 - Coke was purchased
– State 2 - Pepsi was purchased
– (note: states are qualitative)
Markov (transition or probability) matrix
From\To        Coke   Pepsi
Coke           0.9    0.1
Pepsi          0.2    0.8
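A minimal numpy sketch of this matrix (the name P and the 0/1 state coding are our own, not from the slides); note that each row must sum to 1:

    import numpy as np

    # States: 0 = Coke, 1 = Pepsi. Row = this week's purchase, column = next week's.
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    assert np.allclose(P.sum(axis=1), 1.0)  # every row of a transition matrix sums to 1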

Example 1 – Understanding Movement
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8
Quiz: If we start with 100 Coke purchasers and 100 Pepsi purchasers, how many Coke purchasers will there be after 1 week?
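(To check your answer: .9(100) = 90 of the Coke purchasers stay with Coke and .2(100) = 20 of the Pepsi purchasers switch to Coke, so there are 90 + 20 = 110 Coke purchasers after 1 week.)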

Graphical Description – 1: The States
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8

Graphical Description – 2: Transitions from Coke
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi

Graphical Description – 3: All transitions
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8

Example 1 - Starting Conditions
Percentages
– Identify the probability (percentage of shoppers) of starting in each state (we will assume a 50/50 starting market share in the example that follows), or
– Assume we start in one specific state (by setting one probability to 1 and the remaining probabilities to 0)
Counts (numbers)
– Identify the number of shoppers starting in each state

Example 1
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8
Starting probabilities = 50% (or 50 people) each
Questions
– What will happen in the short run (next 3 periods)?
– What will happen in the long run?
– Do the starting probabilities influence the long run?

Graphical Solution After 1 Transition
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8
Coke: (50) → (55)   Pepsi: (50) → (45)
.9(50) = 45, .1(50) = 5, .8(50) = 40, .2(50) = 10

Graphical Solution After 2 Transitions
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8
Coke: (55) → (58.5)   Pepsi: (45) → (41.5)
.9(55) = 49.5, .1(55) = 5.5, .8(45) = 36, .2(45) = 9

Graphical Solution After 3 Transitions
Coke: (58.5) → (60.95)   Pepsi: (41.5) → (39.05)
.9(58.5) = 52.65, .1(58.5) = 5.85, .8(41.5) = 33.2, .2(41.5) = 8.3
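A minimal numpy sketch that reproduces these three transitions (variable names are our own):

    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    x = np.array([50.0, 50.0])  # starting counts: 50 Coke, 50 Pepsi

    for week in range(1, 4):
        x = x @ P       # move the counts one transition forward
        print(week, x)  # 1: [55. 45.], 2: [58.5 41.5], 3: [60.95 39.05]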

Analyzing Markov Chains Using QM for Windows
– Module – Markov Chains
– Number of states – 2
– Number of transitions – 3

Example 1 – After 3 transitions: n-step transition probabilities

End of Period 1 (1-step transition matrix)
From\To        Coke   Pepsi
Coke           .9     .1
Pepsi          .2     .8
End prob (given initial): Coke .55, Pepsi .45

End of Period 2 (2-step transition matrix)
From\To        Coke   Pepsi
Coke           .83    .17
Pepsi          .34    .66
End prob (given initial): Coke .585, Pepsi .415

End of Period 3 (3-step transition matrix)
From\To        Coke   Pepsi
Coke           .781   .219
Pepsi          .438   .562
End prob (given initial): Coke .6095, Pepsi .3905
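These matrices can also be checked outside QM for Windows; a minimal numpy sketch (our own, with P as in Example 1):

    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    for n in (1, 2, 3):
        print(n)
        print(np.linalg.matrix_power(P, n))  # the n-step transition matrix P^n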

Example 1 - Results (3 transitions, start = .5, .5)
3-step transition matrix
From\To        Coke   Pepsi
Coke           .781   .219
Pepsi          .438   .562
Ending probability: Coke .6095, Pepsi .3905 (depends on initial conditions)
Steady-state probability: Coke .6667, Pepsi .3333 (independent of initial conditions)
Note: We end up alternating between Coke and Pepsi
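A sketch of how the steady-state probabilities can be computed, by solving pi P = pi together with the probabilities summing to 1 (variable names are our own):

    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    # Stack the balance equations pi (P - I) = 0 with the normalization sum(pi) = 1.
    A = np.vstack([P.T - np.eye(2), np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)  # ~[0.6667, 0.3333]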

Example 2 - Student Progression Through a University
States
– Freshman
– Sophomore
– Junior
– Senior
– Dropout
– Graduate
– (note: again, states are qualitative)

Example 2 - Student Progression Through a University - States
[State diagram: Freshman, Sophomore, Junior, Senior, Drop out, Graduate]
Note that eventually you must end up in Graduate or Drop out.

Example 2 – Results
[Transition matrix from the Lazarus paper data over states First year, Sophomore, Junior, Senior, Graduate, Drop out, with ending and steady-state probabilities]

From the paper: if there are an equal number of freshmen, sophomores, juniors, and seniors at the beginning of an academic year, then the percentage of this mixed group of students who will graduate is (…)/4 = 91%.

Classification of states
– Absorbing – states that, once entered, are never left. (Graduate, Drop out)
– Recurrent – states that, whenever left, will always be returned to at some time. (Coke, Pepsi)
– Transient – states to which you will eventually never return. (Freshman, Sophomore, Junior, Senior)
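As a small self-check (our own sketch, with a made-up three-state matrix rather than data from the examples): an absorbing state is one whose row places all of its probability on itself.

    import numpy as np

    # Hypothetical matrix: state 2 keeps all of its probability, so it is absorbing.
    P = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.0, 0.0, 1.0]])

    absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
    print(absorbing)  # [2]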

State Classification Exercise
[State diagram: State 1, State 2, State 3, State 4, State 5]
Classify each state as Absorbing, Recurrent, or Transient.

State Classification Article “A non-recursive algorithm for classifying the states of a finite Markov chain” European Journal of Operational Research Vol 28, 1987

Example 3 - Diseases
States
– no disease
– pre-clinical (no symptoms)
– clinical
– death
– (note: again, states are qualitative)
Purpose – transition probabilities can be different for different testing or treatment protocols

Example 4 - Customer Bill Paying
States
– State 0: Bill is paid in full
– State i: Bill is in arrears for i months, i = 1, 2, …, 11
– State 12: Deadbeat

Example 5 - Oil Market
States
– State 0 - oil market is normal
– State 1 - oil market is mildly disrupted
– State 2 - oil market is severely disrupted
– State 3 - oil production is essentially shut down
– Note: states are qualitative
"Tensions around Iran have been supporting the prices in the past years, but the impact might diminish as the United States has said it was ready to release fuel from its Strategic Petroleum Reserve to protect the world economy should oil prices spike."

Example 6 – HIV infections
Based on "Can Difficult-to-Reuse Syringes Reduce the Spread of HIV among Injection Drug Users?" – Caulkins, et al. – Interfaces, Vol. 28, No. 3, May-June 1998.
States
– State 0 – Syringe is uninfected
– State 1 – Syringe is infected
Notes:
– P(0, 1) = .14 (14% of drug users are infected with HIV)
– P(1, 0) = … (…% of the time the virus dies; 33% of the time it is killed by bleaching)

Example 7 – Mental Health (Lazarus)
States – depressed, manic, euthymic/remitted, mortality

Example 8 - Baseball
States
– State 0 - no outs, bases empty
– State 1 - no outs, runner on first
– State 2 - no outs, runner on second
– State 3 - no outs, runner on third
– State 4 - no outs, runners on first and second
– State 5 - no outs, runners on first and third
– State 6 - no outs, runners on second and third
– State 7 - no outs, runners on first, second, and third
– Repeat for 1 out and 2 outs: 8 base-runner configurations × 3 out counts = 24 states
Moneyball by Michael Lewis, p. 134

Example 9 – Football Overtime Playoffs (no time limit)
States
– Team A has ball
– Team B has ball
– Team A scores (absorbing)
– Team B scores (absorbing)
"Win, Lose, or Draw: A Markov Chain Analysis of Overtime in the National Football League", Michael A. Jones, The College Mathematics Journal, Vol. 35, No. 5, November 2004.

EXERCISE STAT 5802 – Markov Chains

Vinayak, Prior, Shinkus, Capozzi, Ltd. is considering leasing one of two possible machines – either from Wang, Inc. or from Koger, Inc. At the beginning of each day, either machine can be found in operating or nonoperating condition. The daily transition matrices of the two machines under identical maintenance are given below. For example, if the Wang machine is operating at the beginning of one day, there is a 95% chance that it will be operating at the beginning of the next day. Assume that everything (leasing costs, repair costs, etc.) except the transition matrices is identical for the two machines. Which machine should Capozzi lease and why?

WANG From\To       Operating   Nonoperating
Operating             .95          .05
Nonoperating          .90          .10

KOGER From\To      Operating   Nonoperating
Operating              …            …
Nonoperating          .85          .15
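As a hedged check (a standard two-state chain result, not from the slides): if P(Operating → Nonoperating) = a and P(Nonoperating → Operating) = b, the long-run fraction of days the machine is operating is b/(a + b). For the Wang machine this gives .90/(.05 + .90) = 18/19 ≈ .947, so Wang operates about 94.7% of days in the long run; Koger can be compared the same way once its operating row is known.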

STAT 5802 – Markov Chains
Assume that at the beginning of work on Monday the Wang machine is in operating condition. The probability that the machine will be in operating condition at the beginning of Friday (4 days later) is _______________.
Assume that at the beginning of work on Monday the Wang machine is in operating condition. The probability that the machine will be in operating condition at the beginning of all 4 days from Monday to Friday is __________________.
In the context of this problem, explain the meaning of a recurrent state.
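A minimal numpy sketch for checking the first two blanks (variable names are our own; it reads "4 days later" as four daily transitions):

    import numpy as np

    # Wang's daily transition matrix; states: 0 = operating, 1 = nonoperating.
    P_wang = np.array([[0.95, 0.05],
                       [0.90, 0.10]])

    # Blank 1: operating on Friday morning, 4 transitions after Monday morning,
    # is the (operating, operating) entry of the 4-step matrix P^4.
    print(np.linalg.matrix_power(P_wang, 4)[0, 0])  # ~0.9474

    # Blank 2: operating on every one of the 4 mornings after Monday requires
    # the operating -> operating step 4 times in a row.
    print(0.95 ** 4)  # ~0.8145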

STAT 5802 – Markov Chains
Suppose that the original model should have included the fact that, for the Koger machine, if a machine is nonoperating on one day there is a 10% chance that it will be nonoperating the next day and a 5% chance that it will be totally unrepairable (rather than the 15% chance listed in the lower right corner), which means that a new machine must be purchased.
Write the new transition matrix for Koger.
What type of state is the new state?
What type of state are operating and nonoperating?

Markov Chains The end