Solutions Markov Chains 2


4) A computer is inspected at the end of every hour. It is found to be either working (up) or failed (down). If the computer is found to be up, the probability of its remaining up for the next hour is 0.90. If it is down, the computer is repaired, which may require more than one hour. Whenever the computer is down (regardless of how long it has been down), the probability of its still being down 1 hour later is 0.35.

a. Construct the one-step transition probability matrix.
b. Find the expected first passage time from i to j for all i, j.

Soln: a. Let

S = 0 if computer is down
  = 1 if computer is up

Then the one-step transition probability matrix is

         0     1
0  [  0.35  0.65 ]
1  [  0.10  0.90 ]

4) (cont.) b. The expected first passage times satisfy m_ij = 1 + Σ_{k≠j} p_ik m_kj:

m_10 = 1 + 0.90 m_10   →   m_10 = 1/0.10 = 10
m_01 = 1 + 0.35 m_01   →   m_01 = 1/0.65 ≈ 1.54
m_00 = 1 + 0.65 m_10 = 1 + 0.65(10) = 7.5
m_11 = 1 + 0.10 m_01 = 1 + 0.10(1.54) ≈ 1.15
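These first passage times can be checked numerically. The sketch below (assuming NumPy is available; the variable names are illustrative) builds the full matrix M of expected first passage times by solving, for each target state j, the linear system obtained from m_ij = 1 + Σ_{k≠j} p_ik m_kj:

```python
import numpy as np

# One-step transition matrix from the problem statement:
# state 0 = down, state 1 = up.
P = np.array([[0.35, 0.65],
              [0.10, 0.90]])
n = P.shape[0]

# For each target j, m_j solves (I - P_j) m_j = 1, where P_j is P
# with column j zeroed out (passage to j stops the walk).
M = np.zeros((n, n))
for j in range(n):
    Pj = P.copy()
    Pj[:, j] = 0.0
    M[:, j] = np.linalg.solve(np.eye(n) - Pj, np.ones(n))

print(np.round(M, 3))
```

For this chain the solver returns m00 = 7.5, m01 ≈ 1.54, m10 = 10, and m11 ≈ 1.15, matching the hand calculation.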

4) (cont.) Alternative solution to b. Solve the steady-state equations:

(1) π0 = 0.35 π0 + 0.10 π1
(2) π1 = 0.65 π0 + 0.90 π1
(3) π0 + π1 = 1

From (1), π1 = 6.5 π0 (4). Substituting (4) into (3) gives π0 = 1/7.5 ≈ .13, and from (4), π1 ≈ .87. The mean recurrence times are then the reciprocals of the steady-state probabilities:

m_00 = 1/π0 = 7.5
m_11 = 1/π1 ≈ 1.15

which agree with the values found above.

5) A manufacturer has a machine that, when operational at the beginning of a day, has a probability of 0.1 of breaking down sometime during the day. When this happens, the repair is done the next day and completed at the end of that day.

a. Formulate the evolution of the status of the machine as a 3-state Markov chain.
b. Find the expected first passage times from i to j.
c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair?

Soln: a. Let

S = 0 if machine running at day's end | running at start
  = 1 if machine down at day's end | running at start
  = 2 if machine running at day's end | down at start

5) (cont.) a. With

S = 0 if machine running at day's end | running at start
  = 1 if machine down at day's end | running at start
  = 2 if machine running at day's end | down at start

continuing in this fashion gives the one-step transition probability matrix

         0    1    2
0  [  0.9  0.1  0  ]
1  [  0    0    1  ]
2  [  0.9  0.1  0  ]

b. Note: This makes intuitive sense. If the machine has a 10% chance of failing on any given day, then the expected number of days between failures is 10 (m01 = 10).

5) (cont.) b. The first passage times satisfy m_ij = 1 + Σ_{k≠j} p_ik m_kj. Working through the system:

m_01 = 1 + 0.9 m_01               →   m_01 = 10
m_12 = 1                          (state 1 always moves to state 2)
m_02 = 1 + 0.9 m_02 + 0.1 m_12    →   m_02 = 11
m_10 = 1 + m_20,  m_20 = 1 + 0.1 m_10   →   m_10 = 20/9 ≈ 2.22,  m_20 = 11/9 ≈ 1.22
m_21 = 1 + 0.9 m_01 = 10
m_11 = 1 + m_21 = 11
m_22 = 1 + 0.9 m_02 + 0.1 m_12 = 11
m_00 = 1 + 0.1 m_10 = 11/9 ≈ 1.22

Back substituting for m_02, m_10, and m_11 confirms these values.

5) (cont.) c. Suppose the machine has gone 20 full days without a breakdown since the last repair was completed. How many days do we expect until the next breakdown/repair?

By the Markov property, the 20 breakdown-free days are irrelevant; only the current state matters. If we read this as the expected number of days to breakdown since the last repair, we are asking for m_21, in which case m_21 = 10. If we read this as the expected number of days to breakdown and subsequent repair since the last repair, we are asking for m_22 = 11. Again, this should make intuitive sense: the machine has a 10% chance of breaking down on any day, so the expected time between failures is 10 days. Since it takes 1 day to repair, the time from repair to repair is 10 + 1 = 11 days.
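All of the first passage times for problem 5 can be obtained at once with the same linear-system approach used for problem 4 (a sketch assuming NumPy; the state coding follows part (a)):

```python
import numpy as np

# States from part (a): 0 = up all day, 1 = fails during the day,
# 2 = down at start of day (repair completes by day's end).
P = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.0, 1.0],
              [0.9, 0.1, 0.0]])
n = P.shape[0]

# For each target j, solve (I - P_j) m_j = 1 with column j of P zeroed.
M = np.zeros((n, n))
for j in range(n):
    Pj = P.copy()
    Pj[:, j] = 0.0
    M[:, j] = np.linalg.solve(np.eye(n) - Pj, np.ones(n))

print(np.round(M, 3))   # m01 = 10, m21 = 10, m11 = m22 = 11
```

The solver confirms the answers to part (c): m_21 = 10 days to the next breakdown, m_22 = 11 days from repair to repair.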

A military maintenance depot overhauls tanks. There is room for 3 tanks in the facility and one tank in an overflow area, so at most 4 tanks can be at the depot at one time. Every morning a tank arrives for an overhaul; if the depot is full, however, it is turned away, and no new arrival occurs under these circumstances. On any given day, the following probabilities govern the number of overhauls completed:

No. tanks    0    1    2    3
Prob.       .2   .4   .3   .1

These values are independent of the number of tanks in the depot, but obviously no more overhauls can be completed than there are tanks waiting in line at the start of the day. Develop a Markov chain model for this situation.

Soln: Let S = # tanks in the depot at the start of a day, just after a tank arrival; S = 1, 2, 3, 4. (Note: since 1 tank arrives each day, we can never have S = 0.)

Start State   Event                     End State   Prob.
4             overhaul 0, 1 declined    4           .2
4             overhaul 1, 1 arrives     4           .4
4             overhaul 2, 1 arrives     3           .3
4             overhaul 3, 1 arrives     2           .1
3             overhaul 0, 1 arrives     4           .2
3             overhaul 1, 1 arrives     3           .4
3             overhaul 2, 1 arrives     2           .3
3             overhaul 3, 1 arrives     1           .1

(cont.)

Start State   Event                     End State   Prob.
2             overhaul 0, 1 arrives     3           .2
2             overhaul 1, 1 arrives     2           .4
2             overhaul all, 1 arrives   1           .3 + .1
1             overhaul 0, 1 arrives     2           .2
1             overhaul all, 1 arrives   1           .4 + .3 + .1

Suppose instead we count tanks at the end of the day; then S = 0, 1, 2, 3, 4 and

Start State   Event                       End State   Prob.
4             1 declined, overhaul 0      4           .2
4             1 declined, overhaul 1      3           .4
4             1 declined, overhaul 2      2           .3
4             1 declined, overhaul 3      1           .1
3             1 arrives, overhaul 0       4           .2
3             1 arrives, overhaul 1       3           .4
3             1 arrives, overhaul 2       2           .3
3             1 arrives, overhaul 3       1           .1
2             1 arrives, overhaul 0       3           .2
2             1 arrives, overhaul 1       2           .4
2             1 arrives, overhaul 2       1           .3
2             1 arrives, overhaul 3       0           .1
1             1 arrives, overhaul 0       2           .2
1             1 arrives, overhaul 1       1           .4
1             1 arrives, overhaul all     0           .3 + .1
0             1 arrives, overhaul 0       1           .2
0             1 arrives, overhaul all     0           .4 + .3 + .1
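The end-of-day transition matrix can be generated programmatically, which also confirms that each row sums to 1. This is a sketch assuming NumPy; `p_complete` and the loop structure are illustrative, not part of the original solution:

```python
import numpy as np

# Daily overhaul-completion distribution from the problem statement.
p_complete = {0: 0.2, 1: 0.4, 2: 0.3, 3: 0.1}

# State = number of tanks at the end of the day (0..4).
# Each morning one tank arrives unless the depot already holds 4;
# then up to min(completions, tanks present) overhauls finish.
P = np.zeros((5, 5))
for s in range(5):
    after_arrival = s + 1 if s < 4 else 4   # arrival declined when full
    for c, prob in p_complete.items():
        done = min(c, after_arrival)        # can't finish more than present
        P[s, after_arrival - done] += prob

print(P)
```

The generated rows reproduce the table above, e.g. P[1, 0] = .3 + .1 = .4 (two or three completions both empty a depot holding two tanks).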