Chapter 0, Slide 1: Probability and Stochastic Processes. References: Wolff, Stochastic Modeling and the Theory of Queues, Chapter 1; Altiok, Performance Analysis of Manufacturing Systems, Chapter 2.

Chapter 0, Slide 2: Basic Probability. Envision an experiment for which the result is unknown. The collection of all possible outcomes is called the sample space. A set of outcomes, or subset of the sample space, is called an event. A probability space is a three-tuple (Ω, F, Pr), where Ω is a sample space, F is a collection of events from the sample space, and Pr is a probability law that assigns a number to each event in F. For any events A and B, Pr must satisfy:
–Pr(Ω) = 1
–Pr(A) ≥ 0
–Pr(A^c) = 1 − Pr(A)
–Pr(A ∪ B) = Pr(A) + Pr(B), if A ∩ B = ∅.
If A and B are events in F with Pr(B) ≠ 0, the conditional probability of A given B is Pr(A | B) = Pr(A ∩ B) / Pr(B).
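A quick enumeration makes the conditional-probability definition concrete. The two-dice events below are illustrative choices, not from the slides; the script counts outcomes to verify Pr(A | B) = Pr(A ∩ B) / Pr(B).

```python
from fractions import Fraction
from itertools import product

# Sample space of two fair dice; A and B are illustrative events.
omega = list(product(range(1, 7), repeat=2))
A = {w for w in omega if w[0] == 6}         # first die shows 6
B = {w for w in omega if sum(w) >= 10}      # total is at least 10

pr = lambda E: Fraction(len(E), len(omega))
print(pr(A & B) / pr(B))                    # Pr(A|B) = (3/36)/(6/36) = 1/2
```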

Chapter 0, Slide 3: Random Variables. Topics: discrete vs. continuous; cumulative distribution function; density function; probability distribution (mass) function; joint distributions; conditional distributions; functions of random variables; moments of random variables; transforms and generating functions. A random variable is “a number that you don’t know… yet” (Sam Savage, Stanford University).

Chapter 0, Slide 4: Functions of Random Variables. Often we’re interested in some combination of r.v.’s:
–Sum of the first k interarrival times = time of the k-th arrival
–Minimum of service times for parallel servers = time until the next departure
If X = min(Y, Z), then Pr(X > t) = Pr(Y > t and Z > t); therefore, if Y and Z are independent, Pr(X > t) = Pr(Y > t) Pr(Z > t). If X = max(Y, Z), then Pr(X ≤ t) = Pr(Y ≤ t and Z ≤ t), which factors the same way under independence. If X = Y + Z, its distribution is the convolution of the distributions of Y and Z; find it by conditioning.
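As a sanity check, here is a minimal Monte Carlo sketch (the rates 1 and 2 and the test point t = 0.5 are assumed for illustration): for independent exponentials, the survival function of the minimum factors, and the cdf of the maximum is the product of the individual cdfs.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_y, mu_z, t = 1.0, 2.0, 0.5                 # assumed rates and test point
y = rng.exponential(1 / mu_y, size=100_000)
z = rng.exponential(1 / mu_z, size=100_000)

# Pr(min(Y,Z) > t) should match exp(-(mu_y + mu_z) * t) for exponentials.
print(np.mean(np.minimum(y, z) > t), np.exp(-(mu_y + mu_z) * t))
# Pr(max(Y,Z) <= t) should match the product of the two cdfs.
print(np.mean(np.maximum(y, z) <= t),
      (1 - np.exp(-mu_y * t)) * (1 - np.exp(-mu_z * t)))
```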

Chapter 0, Slide 5: Conditioning (Wolff). Frequently, the conditional distribution of Y given X is easier to find than the distribution of Y alone. If so, evaluate probabilities about Y using the conditional distribution along with the marginal distribution of X: Pr(Y ∈ A) = Σ_x Pr(Y ∈ A | X = x) Pr(X = x).
–Example: Draw 2 balls without replacement from an urn containing four balls numbered 1, 2, 3 and 4. Let X = number on the first ball, Y = number on the second ball, and Z = XY. What is Pr(Z > 5)?
–Key: it may be easier to evaluate Z when X is known, so condition on X: Pr(Z > 5) = Σ_x Pr(XY > 5 | X = x) Pr(X = x).
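For the urn example, brute-force enumeration confirms the answer obtained by conditioning on X:

```python
from fractions import Fraction
from itertools import permutations

# All ordered draws of two distinct balls from {1, 2, 3, 4} are equally likely.
draws = list(permutations([1, 2, 3, 4], 2))
favorable = [(x, y) for x, y in draws if x * y > 5]
print(Fraction(len(favorable), len(draws)))   # Pr(Z > 5) = 6/12 = 1/2
```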

Chapter 0, Slide 6: Convolution. Let X = Y + Z. If Y and Z are independent, F_X(x) = ∫ F_Y(x − z) f_Z(z) dz.
–Example: if Y and Z are independent Poisson random variables with means λ_1 and λ_2, then Y + Z is Poisson with mean λ_1 + λ_2.
–Note: the expression above is the cdf. To get the density, differentiate: f_X(x) = ∫ f_Y(x − z) f_Z(z) dz.
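A numerical check of the Poisson example (the means 2 and 3 are arbitrary illustrative choices): discretely convolving the two pmfs reproduces the Poisson(5) pmf.

```python
import numpy as np
from scipy.stats import poisson

n = np.arange(20)
pmf_sum = np.convolve(poisson.pmf(n, 2), poisson.pmf(n, 3))[:20]  # pmf of Y+Z
print(np.max(np.abs(pmf_sum - poisson.pmf(n, 5))))  # ~0, up to float error
```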

Chapter 0, Slide 7: Moments of Random Variables. Expectation = “average”: E[X] = Σ_x x p(x) if X is discrete, ∫ x f(x) dx if X is continuous. Variance = “volatility”: Var(X) = E[(X − E[X])²] = E[X²] − (E[X])². Standard deviation: σ_X = √Var(X). Coefficient of variation: cv = σ_X / E[X].
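A small numpy sketch (exponential samples with mean 2, an assumed choice) shows the sample versions of these moments; for the exponential the coefficient of variation equals 1, a common benchmark in queueing models.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)   # assumed: rate 1/2, mean 2

mean, var = x.mean(), x.var()
cv = np.sqrt(var) / mean                       # coefficient of variation
print(mean, var, cv)                           # approximately 2, 4, 1
```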

Chapter 0, Slide 8: Linear Functions of Random Variables. For constants a and b, E[aX + bY] = aE[X] + bE[Y] and Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y). Covariance: Cov(X, Y) = E[XY] − E[X]E[Y]. Correlation: ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y). If X and Y are independent, then Cov(X, Y) = 0 and Var(X + Y) = Var(X) + Var(Y).
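A quick check with independent standard normal samples (an assumed choice): the sample covariance is near 0 and the variances add.

```python
import numpy as np

rng = np.random.default_rng(2)
x, y = rng.normal(size=(2, 100_000))           # independent samples

print(np.cov(x, y)[0, 1])                      # near 0 (independence)
print(np.var(x + y), np.var(x) + np.var(y))    # both near 2
```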

Chapter 0, Slide 9: Transforms and Generating Functions. Moment-generating function: M_X(θ) = E[e^{θX}]. Laplace transform (for a nonnegative r.v.): L_X(s) = E[e^{−sX}]. Generating function (z-transform): let N be a nonnegative integer random variable; then G_N(z) = E[z^N] = Σ_n z^n Pr(N = n).
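A symbolic sketch with sympy shows why these transforms are useful: moments drop out by differentiating at 0. The exponential mgf used here anticipates Slide 20.

```python
import sympy as sp

theta, lam = sp.symbols('theta lam', positive=True)
M = lam / (lam - theta)                        # mgf of an exponential(lam) r.v.

mean = sp.diff(M, theta).subs(theta, 0)        # E[X]   = M'(0)
second = sp.diff(M, theta, 2).subs(theta, 0)   # E[X^2] = M''(0)
print(sp.simplify(mean))                       # 1/lam
print(sp.simplify(second - mean**2))           # Var(X) = 1/lam**2
```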

Chapter 0, Slide 10: Special Distributions. Discrete: Bernoulli, Binomial, Geometric, Poisson. Continuous: Uniform, Exponential, Gamma, Normal.

Chapter 0, Slide 11: Bernoulli Distribution. “Single coin flip”: p = Pr(success), N = 1 if success, 0 otherwise. Pr(N = 1) = p, Pr(N = 0) = 1 − p; E[N] = p, Var(N) = p(1 − p).

Chapter 0, Slide 12: Binomial Distribution. “n independent coin flips”: p = Pr(success), N = # of successes. Pr(N = k) = C(n, k) p^k (1 − p)^{n−k} for k = 0, 1, …, n; E[N] = np, Var(N) = np(1 − p).

Chapter 0, Slide 13: Geometric Distribution. “Independent coin flips”: p = Pr(success), N = # of flips until (and including) the first success. Pr(N = n) = (1 − p)^{n−1} p for n = 1, 2, …; E[N] = 1/p. Memoryless property: having flipped k times without success, Pr(N = k + n | N > k) = Pr(N = n); the remaining number of flips has the same distribution as the original N.
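A simulation sketch of memorylessness (p = 0.3 and k = 5 are assumed values): among runs that survive k flips, the remaining number of flips is again geometric with the same p.

```python
import numpy as np

rng = np.random.default_rng(3)
p, k = 0.3, 5
n = rng.geometric(p, size=1_000_000)        # flips until first success

remaining = n[n > k] - k                    # remaining flips, given N > k
print(np.mean(remaining > 2))               # conditional Pr(remaining > 2)
print((1 - p) ** 2)                         # unconditional Pr(N > 2) = 0.49
```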

Chapter 0, Slide 14: z-Transform for the Geometric Distribution. Given P_n = (1 − p)^{n−1} p, n = 1, 2, …, find G(z) = Σ_{n≥1} z^n P_n = pz / (1 − (1 − p)z). Then E[N] = G′(1) = 1/p.
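A symbolic verification (sympy) of both claims: the series coefficients of G(z) recover the geometric pmf, and G′(1) = 1/p.

```python
import sympy as sp

z, p = sp.symbols('z p', positive=True)
G = p * z / (1 - (1 - p) * z)                 # closed-form z-transform

print(sp.series(G, z, 0, 4))                  # p*z + p*(1-p)*z**2 + ...
print(sp.simplify(sp.diff(G, z).subs(z, 1)))  # E[N] = 1/p
```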

Chapter 0, Slide 15: Poisson Distribution. “Occurrence of rare events”: λ = average rate of occurrence per period, N = # of events in an arbitrary period. Pr(N = n) = e^{−λ} λ^n / n! for n = 0, 1, 2, …; E[N] = Var(N) = λ.

Chapter 0, Slide 16: Uniform Distribution. X is equally likely to fall anywhere within the interval (a, b): f(x) = 1/(b − a) for a < x < b; E[X] = (a + b)/2, Var(X) = (b − a)²/12.

Chapter 0, Slide 17: Exponential Distribution. X is nonnegative and most likely to fall near 0: f(x) = λe^{−λx} and F(x) = 1 − e^{−λx} for x ≥ 0; E[X] = 1/λ, Var(X) = 1/λ². Also memoryless; more on this later…

Chapter 0, Slide 18: Gamma Distribution. X is nonnegative; by varying the shape parameter b we get a variety of shapes: f(x) = λ(λx)^{b−1} e^{−λx} / Γ(b) for x ≥ 0. When b is an integer k, this is called the Erlang-k distribution, and Erlang-1 is the same as the exponential.

Chapter 0, Slide 19: Normal Distribution. X follows a “bell-shaped” density function: f(x) = (1 / (σ√(2π))) e^{−(x−μ)² / (2σ²)}. From the central limit theorem, the distribution of the (standardized) sum of independent and identically distributed random variables approaches a normal distribution as the number of summed random variables goes to infinity.
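A minimal illustration of the central limit theorem (n = 30 uniforms is an assumed choice): standardize the sum and compare a tail probability with the standard normal value Φ(1) ≈ 0.8413.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 30
s = rng.uniform(size=(100_000, n)).sum(axis=1)   # sums of n iid uniforms

z = (s - n * 0.5) / np.sqrt(n / 12.0)            # mean n/2, variance n/12
print(np.mean(z <= 1.0))                         # close to Phi(1) = 0.8413
```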

Chapter 0, Slide 20: m.g.f.’s of the Exponential and Erlang. If X is exponential with rate λ and Y is Erlang-k, then M_X(θ) = λ / (λ − θ) and M_Y(θ) = (λ / (λ − θ))^k. Fact: the mgf of a sum of independent r.v.’s equals the product of the individual mgf’s. Therefore, the sum of k independent exponential r.v.’s (with the same rate λ) follows an Erlang-k distribution.
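A simulation check of the closing claim (rate 2 and k = 3 are assumed values): the sum of k independent exponential(λ) variables is statistically indistinguishable from a gamma (Erlang) with shape k and scale 1/λ.

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(4)
lam, k = 2.0, 3
w = rng.exponential(1 / lam, size=(100_000, k)).sum(axis=1)

# Kolmogorov-Smirnov test against the Erlang-k (gamma) cdf.
print(kstest(w, gamma(a=k, scale=1 / lam).cdf).pvalue)  # typically large
```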

Chapter 0, Slide 21: Stochastic Processes. Topics: the Poisson process and continuous-time Markov chains. A stochastic process is a random variable that changes over time, or a sequence of numbers that you don’t know yet.

Chapter 0, Slide 22: Stochastic Processes (continued). A stochastic process is a set of random variables, or observations of the same random variable over time: {X_t, t ∈ T}. X_t may be either discrete-valued or continuous-valued. A counting process is a discrete-valued, continuous-parameter stochastic process that increases by one each time some event occurs. The value of the process at time t is the number of events that have occurred up to (and including) time t.

Chapter 0, Slide 23: Poisson Process. Let {X(t), t ≥ 0} be a stochastic process where X(t) is the number of events (arrivals) up to time t. Assume X(0) = 0 and:
(i) Pr(one arrival occurs between t and t + Δt) = λΔt + o(Δt), where o(Δt) is some quantity such that o(Δt)/Δt → 0 as Δt → 0;
(ii) Pr(more than one arrival between t and t + Δt) = o(Δt);
(iii) if t < u < v < w, then X(w) − X(v) is independent of X(u) − X(t) (independent increments).
Let p_n(t) = Pr(n arrivals occur during the interval (0, t]). Then p_n(t) = e^{−λt} (λt)^n / n!, the Poisson distribution with mean λt.
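A simulation sketch tying the definition to the result (λ = 1.5 and t = 10 are assumed values): generate exponential interarrival times, count the arrivals in (0, t], and check that the count has mean and variance λt.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t, reps = 1.5, 10.0, 50_000

counts = np.empty(reps)
for i in range(reps):
    arrivals = np.cumsum(rng.exponential(1 / lam, size=60))  # arrival epochs
    counts[i] = np.sum(arrivals <= t)                        # events in (0, t]

print(counts.mean(), counts.var())   # both approximately lam * t = 15
```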

Chapter 0, Slide 24: Poisson Process and the Exponential Distribution. Let T be the time between arrivals. Pr(T > t) = Pr(there are no arrivals in (0, t]) = p_0(t) = e^{−λt}. Therefore F_T(t) = 1 − e^{−λt}; that is, the time between arrivals follows an exponential distribution with parameter λ, the arrival rate. The converse is also true: if interarrival times are exponential, then the number of arrivals up to time t follows a Poisson distribution with mean and variance both equal to λt.

Chapter 0, Slide 25: When are Poisson arrivals reasonable?
1. The Poisson distribution can be seen as a limit of the binomial distribution as n → ∞ and p → 0 with λ = np held constant (see the illustration after this list):
–many potential customers deciding independently about arriving (arrival = “success”),
–each with a small probability of arriving in any particular time interval.
2. The conditions given above hold: the probability of an arrival in a small interval is approximately proportional to the length of the interval, and there are no bulk arrivals.
3. The amount of time since the last arrival gives no indication of the amount of time until the next arrival (exponential, memoryless).
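A numerical illustration of point 1 (λ = 3 is an assumed choice): the Binomial(n, λ/n) pmf converges to the Poisson(λ) pmf as n grows.

```python
import numpy as np
from scipy.stats import binom, poisson

lam, ks = 3.0, np.arange(15)
for n in (10, 100, 10_000):
    err = np.max(np.abs(binom.pmf(ks, n, lam / n) - poisson.pmf(ks, lam)))
    print(n, err)   # the maximum pmf error shrinks as n increases
```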

Chapter 0, Slide 26: More Exponential Distribution Facts.
1. Suppose T_1 and T_2 are independent exponentials with rates λ_1 and λ_2. Then Pr(T_1 < T_2) = λ_1 / (λ_1 + λ_2) (checked by simulation below).
2. Suppose T_1, T_2, …, T_n are independent exponentials with rates λ_1, λ_2, …, λ_n. Let Y = min(T_1, T_2, …, T_n). Then Y is exponential with rate λ_1 + λ_2 + … + λ_n.
3. Suppose T_1, T_2, …, T_k are independent, each exponential with rate λ. Let W = T_1 + T_2 + … + T_k. Then W has an Erlang-k distribution with density function f_W(w) = λ(λw)^{k−1} e^{−λw} / (k − 1)! for w ≥ 0.
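A Monte Carlo check of fact 1 (rates 1 and 3 are assumed values): the empirical frequency of {T_1 < T_2} matches λ_1/(λ_1 + λ_2) = 0.25.

```python
import numpy as np

rng = np.random.default_rng(6)
lam1, lam2 = 1.0, 3.0
t1 = rng.exponential(1 / lam1, size=200_000)
t2 = rng.exponential(1 / lam2, size=200_000)

print(np.mean(t1 < t2), lam1 / (lam1 + lam2))   # both approximately 0.25
```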

Chapter 0, Slide 27: Continuous-Time Markov Chains. A stochastic process {X(t), t ≥ 0} with possible values (state space) S = {0, 1, 2, …} is a CTMC if Pr(X(t + s) = j | X(s) = i and X(u) = x(u) for 0 ≤ u < s) = Pr(X(t + s) = j | X(s) = i): “the future is independent of the past given the present.” Define the transition probabilities p_ij(t) = Pr(X(t + s) = j | X(s) = i); for a time-homogeneous chain these do not depend on s.

Chapter 0, Slide 28: A CTMC, Another Way.
1. Each time X(t) enters state j, the sojourn time is exponentially distributed with mean 1/q_j.
2. When the process leaves state i, it goes to state j ≠ i with probability p_ij, where Σ_{j≠i} p_ij = 1.
Let q_ij = q_i p_ij. Then q_ij is the rate at which the process, when in state i, makes transitions to state j.

Chapter 0, Slide 29: CTMC Infinitesimal Generator. Consider the time it takes the process to go from state i to state j: Pr(a transition from i to j occurs within Δt) = q_ij Δt + o(Δt). Then q_ij is the rate of transition from state i to state j, and q_i = Σ_{j≠i} q_ij is the total rate out of state i. The infinitesimal generator is the matrix Q whose off-diagonal entries are the q_ij and whose diagonal entries are −q_i, so every row of Q sums to zero.

Chapter 0, Slide 30: Long-Run (Steady-State) Probabilities. Let π_j = lim_{t→∞} Pr(X(t) = j). Under certain conditions these limiting probabilities can be shown to exist and to be independent of the starting state. They represent the long-run proportions of time that the process spends in each state, and also the steady-state probabilities that the process will be found in each state. They satisfy πQ = 0 together with Σ_j π_j = 1, or, equivalently, the balance equations q_j π_j = Σ_{i≠j} π_i q_ij for all j (rate out of state j = rate into state j).
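A numerical sketch of solving πQ = 0 with Σ_j π_j = 1. The two-state generator below is an assumed example (a machine that fails at rate 1 and is repaired at rate 4), not from the slides.

```python
import numpy as np

Q = np.array([[-1.0,  1.0],    # state 0: up, fails at rate 1
              [ 4.0, -4.0]])   # state 1: down, repaired at rate 4

# Stack Q^T (rows of pi @ Q = 0) with the normalization row sum(pi) = 1.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                      # [0.8, 0.2]: long-run fractions of time
```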

Chapter 0, Slide 31: Phase-Type Distributions. The Erlang distribution, the hyperexponential distribution, and Coxian (mixtures of generalized Erlang) distributions.