1 Review If time is continuous we cannot write down the joint distribution of $X(t)$ for all $t$ at once. Rather, we pick $n$ and times $t_1, \dots, t_n$ and write down probabilities like $P(X(t_1) \le x_1, \dots, X(t_n) \le x_n)$. These are called the finite-dimensional distributions (fdd's) of the process $X$.

2 Kolmogorov's consistency theorem The fdd's must satisfy the following two conditions:
(i) $F_{t_1,\dots,t_n,t_{n+1}}(x_1,\dots,x_n,x_{n+1}) \to F_{t_1,\dots,t_n}(x_1,\dots,x_n)$ as $x_{n+1} \to \infty$;
(ii) $F_{t_{\pi(1)},\dots,t_{\pi(n)}}(x_{\pi(1)},\dots,x_{\pi(n)}) = F_{t_1,\dots,t_n}(x_1,\dots,x_n)$ for any permutation $\pi$ of $\{1,\dots,n\}$.
Then there exist a probability space and a stochastic process with these fdd's.

3 Markov chains Consider a stochastic process $(X_n, n \ge 0)$ taking values in a discrete state space $S$. It is a Markov chain if $P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i) = p_{ij}$. The matrix $P = (p_{ij})$ is called the transition matrix. Theorem: $P$ has nonnegative entries and all row sums are one. Such matrices are called stochastic.
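As an illustration (not part of the slides), here is a minimal NumPy sketch that checks the stochastic-matrix property and simulates the chain; the 3-state matrix $P$, the seed, and the helper simulate_chain are arbitrary choices of mine.

```python
import numpy as np

# Example 3-state transition matrix (an arbitrary illustrative choice).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Stochastic: nonnegative entries, unit row sums.
assert (P >= 0).all() and np.allclose(P.sum(axis=1), 1.0)

def simulate_chain(P, x0, n_steps, rng):
    """Simulate X_0, ..., X_n from the transition matrix P."""
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return np.array(path)

rng = np.random.default_rng(0)
print(simulate_chain(P, x0=0, n_steps=10, rng=rng))
```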

4 Chapman-Kolmogorov The quantities $p_{ij}(n) = P(X_{m+n} = j \mid X_m = i)$ are called the n-step transition probabilities; the matrix of them is denoted $P^{(n)}$. Theorem (Chapman-Kolmogorov): $P^{(n+m)} = P^{(n)} P^{(m)}$, so that $P^{(n)} = P^n$. Let $\mu_k^{(n)} = P(X_n = k)$ and $\mu^{(n)} = (\mu_k^{(n)})$. Then $\mu^{(m+n)} = \mu^{(n)} P^{m}$; in particular, $\mu^{(n)} = \mu^{(0)} P^n$.
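A quick numerical check of these identities, again a sketch of my own with an arbitrary example matrix $P$ and starting distribution:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

P3 = np.linalg.matrix_power(P, 3)      # P^(3)
P5 = np.linalg.matrix_power(P, 5)      # P^(5)
P8 = np.linalg.matrix_power(P, 8)      # P^(8)

# Chapman-Kolmogorov: P^(5+3) = P^(5) P^(3)
print(np.allclose(P8, P5 @ P3))        # True

# Marginal distribution at time n: mu^(n) = mu^(0) P^n
mu0 = np.array([1.0, 0.0, 0.0])        # start in state 0
print(mu0 @ P8)                        # vector of P(X_8 = k)
```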

5 A branching process [Figure: a branching-process family tree with generation sizes $Z_0 = 1$, $Z_1 = 2$, $Z_2 = 4$, $Z_3 = 3$.]
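A small simulation sketch of the generation sizes $Z_0, Z_1, \dots$ (my own illustration; the Poisson(1.2) offspring distribution and the helper branching_process are arbitrary choices):

```python
import numpy as np

def branching_process(n_gen, rng, offspring_mean=1.2):
    """Simulate generation sizes Z_0, ..., Z_n with Poisson offspring."""
    z = [1]                                            # Z_0 = 1
    for _ in range(n_gen):
        # Each of the Z_n individuals gets an independent Poisson family.
        z.append(int(rng.poisson(offspring_mean, size=z[-1]).sum()))
    return z

rng = np.random.default_rng(1)
print(branching_process(10, rng))   # simulated generation sizes
```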

6 Properties Let $u_n = P(Z_n = 0)$. Then $u_n = G_X(u_{n-1})$, and $u_n$ increases to the extinction probability, which is the smallest non-negative root of $G_X(s) = s$. It is 1 if $E(X) \le 1$.
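The recursion $u_n = G_X(u_{n-1})$ can be iterated numerically to approximate the extinction probability. A sketch, assuming a Poisson($m$) offspring distribution purely for illustration:

```python
import numpy as np

def extinction_probability(G, n_iter=2000):
    """Iterate u_n = G(u_{n-1}) from u_0 = 0; the iterates converge to the
    smallest non-negative root of G(s) = s."""
    u = 0.0
    for _ in range(n_iter):
        u = G(u)
    return u

# Poisson(m) offspring distribution has pgf G_X(s) = exp(m (s - 1)).
for m in (0.8, 1.0, 1.5):
    print(m, extinction_probability(lambda s: np.exp(m * (s - 1.0))))
    # theory: the extinction probability is 1 for m <= 1 and < 1 for m > 1
```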

7 Classification of states A state $i$ for a Markov chain $(X_k)$ is called persistent if $P(X_n = i \text{ for some } n \ge 1 \mid X_0 = i) = 1$, and transient otherwise. Let $f_{ij}(n) = P(X_1 \ne j, \dots, X_{n-1} \ne j, X_n = j \mid X_0 = i)$ and $f_{ij} = \sum_n f_{ij}(n)$. Then $j$ is persistent iff $f_{jj} = 1$. Let $P_{ij}(s) = \sum_n p_{ij}(n) s^n$ and $F_{ij}(s) = \sum_n f_{ij}(n) s^n$.

8 Some results Theorem: (a) $P_{ii}(s) = 1 + F_{ii}(s) P_{ii}(s)$; (b) $P_{ij}(s) = F_{ij}(s) P_{jj}(s)$ for $i \ne j$. Corollary: (a) State $j$ is persistent if $\sum_n p_{jj}(n) = \infty$, and then $\sum_n p_{ij}(n) = \infty$ for all $i$ with $f_{ij} > 0$. (b) State $j$ is transient if $\sum_n p_{jj}(n) < \infty$, and then $\sum_n p_{ij}(n) < \infty$ for all $i$.

9 Mean recurrence time Let $T_i = \min\{n > 0: X_n = i\}$ and $\mu_i = E(T_i \mid X_0 = i)$. For a transient state $\mu_i = \infty$. For a persistent state $\mu_i = \sum_n n f_{ii}(n)$. We call a persistent state positive persistent if $\mu_i < \infty$, and null persistent otherwise. (Positive recurrent = non-null persistent.)

10 Communication Two states $i$ and $j$ communicate, written $i \to j$, if $p_{ij}(m) > 0$ for some $m$. States $i$ and $j$ intercommunicate, written $i \leftrightarrow j$, if $i \to j$ and $j \to i$. Theorem: $\leftrightarrow$ is an equivalence relation. Theorem: If $i \leftrightarrow j$ then (a) $i$ is transient iff $j$ is transient; (b) $i$ is persistent iff $j$ is persistent.

11 Closed and irreducible sets A set $C$ of states is closed if $p_{ij} = 0$ for all $i \in C$, $j \notin C$. $C$ is irreducible if $i \leftrightarrow j$ for all $i, j \in C$. Theorem (decomposition): $S = T \cup C_1 \cup C_2 \cup \dots$, where $T$ is the set of transient states and the $C_i$ are irreducible disjoint closed sets of persistent states. Note: the $C_i$ are the equivalence classes for $\leftrightarrow$.

12 Stationary distribution Theorem: An irreducible chain has a stationary distribution $\pi$ iff the states are positive persistent. Then $\pi$ is unique and given by $\pi_i = 1/\mu_i$. We compute $\pi$ by solving $\pi P = \pi$ (together with $\sum_i \pi_i = 1$).
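Numerically this means solving the linear system $\pi P = \pi$ together with $\sum_i \pi_i = 1$. A minimal sketch, with the same kind of arbitrary example matrix as above:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
n = P.shape[0]

# pi P = pi  <=>  (P^T - I) pi = 0, solved together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)                        # stationary distribution
print(np.allclose(pi @ P, pi))   # check: pi P = pi
```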

13 Reversible chains $X$ is called reversible if its time-reversal $Y$ has the same transition probabilities as $X$. Reversible processes satisfy $\pi_i p_{ij} = \pi_j p_{ji}$ for all $i, j$, the law of detailed balance.

14 Law of large numbers Let $X$ be positive persistent. Then $\frac{1}{N}\sum_{n=1}^{N} g(X_n) \to \sum_i g(i)\,\pi_i$ almost surely as $N \to \infty$. To estimate such an integral (an expectation under $\pi$) we can compute averages from a Markov chain with stationary distribution $\pi$.
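A simulation sketch of this law of large numbers (the chain $P$, the function $g$, and the run length are arbitrary choices of mine): the time average of $g(X_n)$ along one long path is compared with the stationary expectation.

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
g = np.array([1.0, 4.0, 9.0])           # an arbitrary function on the states

rng = np.random.default_rng(2)
x, n_steps, total = 0, 100_000, 0.0
for _ in range(n_steps):
    x = rng.choice(3, p=P[x])
    total += g[x]
print(total / n_steps)                  # time average of g(X_n)

pi = np.linalg.matrix_power(P, 200)[0]  # rows of P^n converge to pi here
print(g @ pi)                           # stationary expectation of g
```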

15 The Gibbs sampler Suppose $f$ (= $\pi$) is a distribution on $S^d$. We generate a Markov chain with stationary distribution $f$ by consecutively drawing from $f(x_i \mid x_j,\, j \ne i)$, $i = 1, \dots, d$ (called the full conditionals). The $n$th step of the chain is the whole set of $d$ draws from the $d$ different conditional distributions.
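A minimal Gibbs-sampler sketch of my own, for a target that is not from the slides: a standard bivariate normal with correlation $\rho$, whose full conditionals are the univariate normals used in the code.

```python
import numpy as np

def gibbs_bivariate_normal(n_samples, rho, rng):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Full conditionals: X1 | X2 = x2 ~ N(rho*x2, 1 - rho^2), and symmetrically."""
    x1, x2 = 0.0, 0.0
    sd = np.sqrt(1.0 - rho ** 2)
    out = np.empty((n_samples, 2))
    for n in range(n_samples):
        # one step of the chain = one full sweep of d = 2 conditional draws
        x1 = rng.normal(rho * x2, sd)
        x2 = rng.normal(rho * x1, sd)
        out[n] = (x1, x2)
    return out

rng = np.random.default_rng(3)
draws = gibbs_bivariate_normal(50_000, rho=0.8, rng=rng)
print(np.corrcoef(draws.T))   # sample correlation should be close to 0.8
```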

16 The Metropolis algorithm Let $Q$ be a symmetric proposal transition matrix. When in state $x$, the next state is chosen as follows:
1. Draw $y$ from $q_{x\cdot}$.
2. Calculate $r = f(y)/f(x)$.
3. If $r \ge 1$, the next value is $y$.
4. If $r < 1$, go to $y$ with probability $r$ and stay at $x$ with probability $1 - r$.
The resulting process is clearly Markov.
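A sketch of the algorithm for a discrete target $f$ (the target values and the symmetric ±1 random-walk proposal are my own illustrative choices):

```python
import numpy as np

def metropolis(f, n_steps, rng, x0=0):
    """Metropolis algorithm with a symmetric +/-1 random-walk proposal on
    {0, ..., len(f)-1}; proposals that fall off the ends are rejected."""
    K, x = len(f), x0
    path = np.empty(n_steps, dtype=int)
    for n in range(n_steps):
        y = x + rng.choice([-1, 1])           # 1. draw y from the symmetric proposal
        if 0 <= y < K:
            r = f[y] / f[x]                   # 2. acceptance ratio
            if r >= 1 or rng.random() < r:    # 3./4. accept with prob. min(1, r)
                x = y
        path[n] = x
    return path

f = np.array([1.0, 2.0, 4.0, 2.0, 1.0])       # unnormalised target on 5 states
rng = np.random.default_rng(4)
path = metropolis(f, 200_000, rng)
print(np.bincount(path, minlength=len(f)) / len(path))   # ~ f / f.sum()
```

Note that only the ratio $f(y)/f(x)$ is used, so the target need not be normalised.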

17 The Markov property $X(t)$ is a Markov process if, for any $n$, $P(X(t) = j \mid X(t_n) = i_n, \dots, X(t_0) = i_0) = P(X(t) = j \mid X(t_n) = i_n)$ for all $j, i_0, \dots, i_n$ in $S$ and any $t_0 < t_1 < \dots < t_n < t$. The transition probabilities $p_{ij}(s,t) = P(X(t) = j \mid X(s) = i)$ are homogeneous if $p_{ij}(s,t) = p_{ij}(0, t-s)$. We will usually assume this, and write $p_{ij}(t)$.

18 The general birth process The probability of a birth in $(t, t + \delta t)$, when $n$ individuals are present, is $\lambda_n \delta t + o(\delta t)$. We take $X(0) = 1$, so the quantities of interest are $p_{1n}(t) = P(X(t) = n)$.
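A simulation sketch of a general birth process via its exponential holding times; the linear rate $\lambda_n = 0.7\,n$ (a Yule process) is an arbitrary illustrative choice:

```python
import numpy as np

def simulate_birth_process(rate, t_end, rng):
    """Pure birth process with X(0) = 1: in state n, wait Exp(rate(n)), then n -> n+1.
    Returns jump times and population sizes up to time t_end."""
    t, n = 0.0, 1
    times, sizes = [0.0], [1]
    while True:
        t += rng.exponential(1.0 / rate(n))   # holding time ~ Exp(lambda_n)
        if t > t_end:
            break
        n += 1
        times.append(t)
        sizes.append(n)
    return np.array(times), np.array(sizes)

rng = np.random.default_rng(5)
times, sizes = simulate_birth_process(lambda n: 0.7 * n, t_end=5.0, rng=rng)
print(sizes[-1])   # population size at time 5 for the linear (Yule) case
```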

19 A result Theorem: For any $t > 0$, $\sum_n p_{1n}(t) = 1$ if and only if $\sum_n 1/\lambda_n = \infty$. A process for which $\sum_n p_{1n}(t) < 1$ is called dishonest: there is positive probability that the process is not in the state space at time $t$; it has gone off to infinity, or exploded.

20 Forward equations For the birth process, conditioning on the state at time $t$ gives $p_{1n}(t + \delta t) = p_{1n}(t)(1 - \lambda_n \delta t) + p_{1,n-1}(t)\,\lambda_{n-1} \delta t + o(\delta t)$, so $p'_{1n}(t) = \lambda_{n-1} p_{1,n-1}(t) - \lambda_n p_{1n}(t)$, or in matrix form $P'_t = P_t G$, where $G$ is the generator.

21 Stationary distribution $\pi$ is a stationary distribution for $P_t$ if $\pi = \pi P_t$ for all $t$. Theorem: $\pi$ is stationary for $P_t$ iff $\pi G = 0$ (under suitable regularity conditions).

22 Construction The way continuous-time Markov chains work is:
(1) Draw an initial value $i_0$ from $\mu^{(0)}$.
(2) If $i_0$ is not absorbing, stay in $i_0$ for a random time which is exponentially distributed with rate $-g_{i_0 i_0}$.
(3) Draw a new state $j \ne i_0$ from the distribution $g_{i_0 j}/(-g_{i_0 i_0})$, where $G = (g_{ij})$ is the generator, and repeat from (2).
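This construction translates directly into a simulation sketch; the 3-state generator $G$ below, the seed, and the helper simulate_ctmc are arbitrary choices of mine, not from the slides.

```python
import numpy as np

def simulate_ctmc(G, mu0, t_end, rng):
    """Simulate a continuous-time Markov chain with generator G = (g_ij) and
    initial distribution mu0 on [0, t_end]; returns jump times and states."""
    i = rng.choice(len(mu0), p=mu0)               # (1) initial state from mu^(0)
    t, times, states = 0.0, [0.0], [i]
    while True:
        rate = -G[i, i]
        if rate == 0.0:                           # absorbing state: stay forever
            break
        t += rng.exponential(1.0 / rate)          # (2) holding time ~ Exp(-g_ii)
        if t > t_end:
            break
        jump = G[i].copy()
        jump[i] = 0.0
        i = rng.choice(len(mu0), p=jump / rate)   # (3) jump prob. g_ij / (-g_ii)
        times.append(t)
        states.append(i)
    return np.array(times), np.array(states)

# Arbitrary 3-state generator: off-diagonals nonnegative, rows sum to zero.
G = np.array([[-1.0, 0.6, 0.4],
              [ 0.3, -0.8, 0.5],
              [ 0.2, 0.7, -0.9]])
rng = np.random.default_rng(6)
times, states = simulate_ctmc(G, np.array([1.0, 0.0, 0.0]), t_end=10.0, rng=rng)
print(list(zip(times.round(2), states)))
```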

23 Persistence A chain is irreducible if $p_{ij}(t) > 0$ for some $t$, for all pairs $i, j$ in $S$. Fact (Lévy dichotomy): either $p_{ij}(t) > 0$ for all $t > 0$, or $p_{ij}(t) = 0$ for all $t > 0$. We call a state $i$ persistent if $P(\{t: X(t) = i\} \text{ is unbounded} \mid X(0) = i) = 1$. Let $Y_n = X(nh)$ be the discrete skeleton of $X$ and let $Q$ be its transition matrix, so persistence in continuous time is the same as persistence for the discrete skeleton.

24 Birth, death, immigration and emigration Let the process have transition rates $g_{n,n+1} = \lambda_n$ and $g_{n,n-1} = \mu_n$ (all other off-diagonal rates zero). The $\mu_n$ are called death rates, and the $\lambda_n$ are called birth rates. The process is a birth and death process. If $\lambda_n = n\lambda + \nu$ we have linear birth with immigration. If $\mu_n = (\mu + \delta)n$ we have linear death and emigration.

25 Poisson process A birth process with rate $\lambda$ independent of the state. The number of events in $(0, t]$ is independent of the number of events in $(t, t+s]$: $X_t$ has independent increments. If we delete points of a Poisson process independently with probability $1 - p$ (thinning), we get a Poisson process of rate $\lambda p$.
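A quick numerical check of the thinning property (a sketch; the rate, retention probability and time horizon are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, p, T = 3.0, 0.4, 1000.0

# A rate-lam Poisson process on (0, T] via i.i.d. Exp(lam) inter-event times.
gaps = rng.exponential(1.0 / lam, size=int(2 * lam * T))
points = np.cumsum(gaps)
points = points[points <= T]

# Thin: keep each point independently with probability p.
kept = points[rng.random(points.size) < p]

print(points.size / T)   # ~ lam
print(kept.size / T)     # ~ lam * p
```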

26 General definition Consider points in some space $S$, a subset of $\mathbb{R}^d$. They constitute a Poisson point pattern if
(i) $N(A) \sim \mathrm{Po}(\Lambda(A))$;
(ii) $N(A)$ is independent of $N(B)$ for disjoint $A$ and $B$.
$\Lambda(\cdot)$ is called the mean measure. If $\Lambda(A) = \int_A \lambda(s)\,ds$ we call $\lambda(s)$ the intensity function.

27 A conditional property Let $N$ be a Poisson counting process with intensity $\lambda(x)$. Suppose $A$ is a set with $\Lambda(A) < \infty$ and $N(A) = n$, and let $Q(B) = \Lambda(B)/\Lambda(A)$ for $B \subseteq A$; it has density $\lambda(x)/\Lambda(A)$. Then the points in $A$ have the same distribution as $n$ points drawn independently from the distribution $Q$.
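This conditional property gives a simple way to simulate an inhomogeneous Poisson process on a bounded set: draw $N(A) \sim \mathrm{Po}(\Lambda(A))$, then place that many i.i.d. points with density $\lambda(x)/\Lambda(A)$. A sketch for the illustrative intensity $\lambda(x) = 2x$ on $A = (0, 1]$, sampling from the density by inversion:

```python
import numpy as np

rng = np.random.default_rng(8)

# Intensity lambda(x) = 2x on A = (0, 1], so Lambda(A) = 1 and Q has density 2x;
# Q can be sampled by inversion: X = sqrt(U) with U uniform on (0, 1).
Lambda_A = 1.0
n = rng.poisson(Lambda_A)           # N(A) ~ Po(Lambda(A))
points = np.sqrt(rng.random(n))     # n i.i.d. draws from the density 2x on (0, 1]

print(n, np.sort(points))
```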

28 Brownian motion A Brownian motion process is a stochastic process having:
(1) independent increments;
(2) stationary increments;
(3) continuity: for any $\delta > 0$, $P(|X(t+h) - X(t)| \ge \delta) = o(h)$ as $h \downarrow 0$.

29 Properties of Brownian motion process $X(t) \sim N(\mu t, \sigma^2 t)$; continuous paths; finite squared variation; not of bounded variation; nowhere-differentiable paths.
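A simulation sketch of such a process ($\mu$ and $\sigma$ below are arbitrary): a path on $[0, 1]$ built from independent, stationary Gaussian increments, printing one draw of $X(1)$ and the squared variation of the path.

```python
import numpy as np

rng = np.random.default_rng(9)
mu, sigma = 0.5, 2.0                # arbitrary drift and diffusion parameters
n, T = 100_000, 1.0
dt = T / n

# Independent, stationary increments: X(t+dt) - X(t) ~ N(mu*dt, sigma^2*dt).
increments = rng.normal(mu * dt, sigma * np.sqrt(dt), size=n)
X = np.concatenate([[0.0], np.cumsum(increments)])

print(X[-1])                        # one draw of X(1) ~ N(mu, sigma^2)
print(np.sum(increments ** 2))      # squared variation, close to sigma^2 * T
```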

