
STAT 380 Lecture 19

End of Summary for Continuous Time Markov Chains:

Detailed development

Suppose X is a Markov chain with stationary transitions. Then

\begin{align*}
P_{ij}(t+s) &= P(X(t+s)=j \mid X(0)=i) \\
 &= \sum_k P(X(t+s)=j, X(s)=k \mid X(0)=i) \\
 &= \sum_k P(X(t+s)=j \mid X(s)=k, X(0)=i) \, P(X(s)=k \mid X(0)=i) \\
 &= \sum_k P_{kj}(t) \, P_{ik}(s) .
\end{align*}

This shows

\[ P(t+s) = P(s) P(t) , \]

which is the Chapman-Kolmogorov equation.
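As a quick numerical illustration (not part of the notes), the sketch below checks the identity P(s+t) = P(s)P(t) for an arbitrarily chosen three state generator, computing transition matrices as matrix exponentials P(t) = e^{tQ} (a representation that, for a finite chain, solves the Kolmogorov equations derived below). All rates are purely illustrative.

# Numerical check of the Chapman-Kolmogorov equation P(s+t) = P(s) P(t).
# The 3-state generator Q below is arbitrary, chosen only for illustration;
# transition matrices are computed as P(t) = expm(t*Q).
import numpy as np
from scipy.linalg import expm

Q = np.array([[-2.0,  1.5,  0.5],
              [ 1.0, -3.0,  2.0],
              [ 0.5,  0.5, -1.0]])   # each row sums to 0

s, t = 0.7, 1.3
P = lambda u: expm(u * Q)

lhs = P(s + t)
rhs = P(s) @ P(t)
print(np.allclose(lhs, rhs))   # True: P(s+t) = P(s) P(t)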

Now consider the chain starting from i and let $T_i$ be the first t for which $X(t) \ne i$. Then $T_i$ is a stopping time.

[Technically:

\[ \{ T_i \le t \} \in \sigma\{ X(s) : s \le t \} \]

for each t.] Then

\begin{align*}
P(T_i > t+s \mid T_i > s, X(0)=i) &= P(T_i > t+s \mid X(u) = i, \; 0 \le u \le s) \\
 &= P(T_i > t \mid X(0) = i)
\end{align*}

by the Markov property.

Conclusion: given X(0)=i, $T_i$ has the memoryless property, so $T_i$ has an exponential distribution. Let $\lambda_i$ be the rate parameter.

Embedded Chain: Skeleton

Let $0 = T_0 < T_1 < T_2 < \cdots$ be the stopping times at which transitions occur. Then $Y_n = X(T_n)$ is the state entered at the nth transition. The sequence $Y_0, Y_1, Y_2, \ldots$ is a Markov chain by the Markov property. That its transition matrix has $P_{ii} = 0$ reflects the fact that $Y_{n+1} \ne Y_n$ by design.

As before we say $i \to j$ if $P_{ij}(t) > 0$ for some t. It is fairly clear that $i \to j$ for the X(t) if and only if $i \to j$ for the embedded chain $Y_n$.

We say $i \leftrightarrow j$ (i and j communicate) if $i \to j$ and $j \to i$.

Now consider

\[ P(X(t+h) = j \mid X(t) = i) . \]

Suppose the chain has made n transitions so far, so that $T_n \le t < T_{n+1}$. Then the event X(t+h)=j is, except for possibilities of probability o(h), the event that

\[ T_{n+1} \in (t, t+h] \quad \text{and} \quad Y_{n+1} = j . \]

The probability of this is

\[ \lambda_i h P_{ij} + o(h) , \]

where $P_{ij}$ (with no time argument) is the embedded chain transition probability: by memorylessness the residual holding time in state i is Exponential($\lambda_i$), so a jump occurs in $(t, t+h]$ with probability $\lambda_i h + o(h)$, and it lands in j with probability $P_{ij}$.
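This jump-chain description also doubles as a simulation recipe: hold an Exponential($\lambda_i$) time in state i, then jump according to the embedded chain. A minimal sketch follows; the rates, the embedded transition matrix and the helper name simulate_path are all illustrative choices, not anything from the notes. A path produced this way could be used to check empirically that P(X(t+h)=j | X(t)=i) is approximately $\lambda_i P_{ij} h$ for small h.

# Simulate a continuous time Markov chain from its jump-chain description:
# hold in state i an Exponential(lambda_i) time, then jump according to the
# embedded chain P_embed (zero diagonal).  All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)

lam = np.array([2.0, 1.0, 3.0])           # holding-time rates lambda_i
P_embed = np.array([[0.0, 0.7, 0.3],      # embedded (jump) chain, P_ii = 0
                    [0.4, 0.0, 0.6],
                    [0.5, 0.5, 0.0]])

def simulate_path(i0, t_end):
    """Return the jump times and states visited up to time t_end."""
    t, i = 0.0, i0
    times, states = [0.0], [i0]
    while True:
        t += rng.exponential(1.0 / lam[i])        # holding time in state i
        if t >= t_end:
            return times, states
        i = rng.choice(3, p=P_embed[i])           # jump via embedded chain
        times.append(t)
        states.append(i)

times, states = simulate_path(0, 10.0)
print(states[:10])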

Kolmogorov's Equations

The Chapman-Kolmogorov equations are

\[ P(t+h) = P(t) P(h) . \]

Subtract P(t) from both sides, divide by h and let $h \to 0$. Remember that P(0) is the identity. We find

\[ \frac{P(t+h) - P(t)}{h} = P(t) \, \frac{P(h) - P(0)}{h} , \]

which gives

\[ P'(t) = P(t) P'(0) . \]

The Chapman-Kolmogorov equations can also be written

\[ P(t+h) = P(h) P(t) . \]

Now subtracting P(t) from both sides, dividing by h and letting $h \to 0$ gives

\[ P'(t) = P'(0) P(t) . \]

Look at these equations in component form:

\[ P'(t) = P'(0) P(t) \]

becomes

\[ P_{ij}'(t) = \sum_k P_{ik}'(0) P_{kj}(t) . \]

For $i \ne k$ our calculations of instantaneous transition rates give

\[ P_{ik}'(0) = \lim_{h \to 0} \frac{P_{ik}(h)}{h} = \lambda_i P_{ik} . \]

For i = k we have

\[ P_{ii}(h) = e^{-\lambda_i h} + o(h) = 1 - \lambda_i h + o(h) . \]

(X(h)=i either means $T_i > h$, which has probability $e^{-\lambda_i h}$, or there have been two or more transitions in [0,h], a possibility of probability o(h).) Thus

\[ P_{ii}'(0) = \lim_{h \to 0} \frac{P_{ii}(h) - 1}{h} = -\lambda_i . \]

Let Q be the matrix with entries

\[ Q_{ik} = \begin{cases} \lambda_i P_{ik} & i \ne k \\ -\lambda_i & i = k . \end{cases} \]

Q is the infinitesimal generator of the chain.
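A minimal sketch of assembling Q from the rates $\lambda_i$ and the embedded chain; the numerical values are the same illustrative ones used above, not anything from the notes.

# Build the infinitesimal generator Q from holding rates lambda_i and the
# embedded chain: Q_ik = lambda_i P_ik for i != k and Q_ii = -lambda_i.
# Each row of Q then sums to zero.
import numpy as np

lam = np.array([2.0, 1.0, 3.0])
P_embed = np.array([[0.0, 0.7, 0.3],
                    [0.4, 0.0, 0.6],
                    [0.5, 0.5, 0.0]])

Q = lam[:, None] * P_embed - np.diag(lam)
print(Q)
print(Q.sum(axis=1))   # each row sums to 0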

Thus

\[ P'(t) = P'(0) P(t) \]

becomes

\begin{align*}
P'(t) &= Q P(t) , \\
P_{ij}'(t) &= \sum_k Q_{ik} P_{kj}(t) .
\end{align*}

Called Kolmogorov's backward equations.

On the other hand

\[ P'(t) = P(t) P'(0) \]

becomes

\begin{align*}
P'(t) &= P(t) Q , \\
P_{ij}'(t) &= \sum_k P_{ik}(t) Q_{kj} .
\end{align*}

These are Kolmogorov's forward equations.

Remark: When the state space is infinite the forward equations may not be justified. In deriving them we interchanged a limit with an infinite sum; the interchange is always justified for the backward equations but not always for the forward equations.
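For a finite state space both systems are easy to check numerically: compare a finite-difference derivative of P(t) = e^{tQ} with QP(t) and with P(t)Q. The generator below is again an arbitrary illustrative choice.

# Check Kolmogorov's backward (P' = QP) and forward (P' = PQ) equations
# numerically for an illustrative finite-state generator Q.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-2.0,  1.5,  0.5],
              [ 1.0, -3.0,  2.0],
              [ 0.5,  0.5, -1.0]])

t, h = 1.0, 1e-6
P = lambda u: expm(u * Q)
dP = (P(t + h) - P(t - h)) / (2 * h)          # central-difference derivative

print(np.allclose(dP, Q @ P(t), atol=1e-5))   # backward equations
print(np.allclose(dP, P(t) @ Q, atol=1e-5))   # forward equations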

Example: the two state chain with $S = \{0, 1\}$. Then

\[ P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \]

for the embedded chain, and the chain is otherwise specified by $\lambda_0 = \lambda$ and $\lambda_1 = \mu$. The matrix Q is

\[ Q = \begin{pmatrix} -\lambda & \lambda \\ \mu & -\mu \end{pmatrix} . \]

The backward equations become

\begin{align*}
P_{00}'(t) &= -\lambda P_{00}(t) + \lambda P_{10}(t) \\
P_{01}'(t) &= -\lambda P_{01}(t) + \lambda P_{11}(t) \\
P_{10}'(t) &= \mu P_{00}(t) - \mu P_{10}(t) \\
P_{11}'(t) &= \mu P_{01}(t) - \mu P_{11}(t)
\end{align*}

while the forward equations are

\begin{align*}
P_{00}'(t) &= -\lambda P_{00}(t) + \mu P_{01}(t) \\
P_{01}'(t) &= \lambda P_{00}(t) - \mu P_{01}(t) \\
P_{10}'(t) &= -\lambda P_{10}(t) + \mu P_{11}(t) \\
P_{11}'(t) &= \lambda P_{10}(t) - \mu P_{11}(t)
\end{align*}

Multiply the first backward equation by $\mu$ and the third by $\lambda$, then add, to get

\[ \mu P_{00}'(t) + \lambda P_{10}'(t) = 0 \]

so

\[ \mu P_{00}(t) + \lambda P_{10}(t) = c . \]

Put t=0 to get $c = \mu P_{00}(0) + \lambda P_{10}(0) = \mu$. This gives

\[ P_{10}(t) = \frac{\mu}{\lambda} \bigl( 1 - P_{00}(t) \bigr) . \]

Plug this back into the first equation and get

\[ P_{00}'(t) = -\lambda P_{00}(t) + \mu \bigl( 1 - P_{00}(t) \bigr) = \mu - (\lambda + \mu) P_{00}(t) . \]

Multiply by $e^{(\lambda+\mu)t}$ and get

\[ \frac{d}{dt} \Bigl[ e^{(\lambda+\mu)t} P_{00}(t) \Bigr] = \mu e^{(\lambda+\mu)t} , \]

which can be integrated (using $P_{00}(0) = 1$) to get

\[ P_{00}(t) = \frac{\mu}{\lambda+\mu} + \frac{\lambda}{\lambda+\mu} e^{-(\lambda+\mu)t} . \]
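A quick check of this closed form against the matrix exponential, with illustrative values of $\lambda$ and $\mu$:

# Check the two-state solution P_00(t) = mu/(lam+mu) + lam/(lam+mu) exp(-(lam+mu) t)
# against P(t) = expm(t*Q) for illustrative rates lam, mu.
import numpy as np
from scipy.linalg import expm

lam, mu = 2.0, 3.0
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])

for t in (0.1, 0.5, 2.0):
    p00_formula = mu / (lam + mu) + lam / (lam + mu) * np.exp(-(lam + mu) * t)
    p00_expm = expm(t * Q)[0, 0]
    print(t, np.isclose(p00_formula, p00_expm))   # True for each t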

Alternative calculation:

\[ Q = \begin{pmatrix} -\lambda & \lambda \\ \mu & -\mu \end{pmatrix} \]

can be written as

\[ Q = A D A^{-1} , \]

where

\[ A = \begin{pmatrix} 1 & \lambda \\ 1 & -\mu \end{pmatrix} , \]

\[ D = \begin{pmatrix} 0 & 0 \\ 0 & -(\lambda+\mu) \end{pmatrix} \]

and

\[ A^{-1} = \frac{1}{\lambda+\mu} \begin{pmatrix} \mu & \lambda \\ 1 & -1 \end{pmatrix} . \]

(The columns of A are right eigenvectors of Q, with eigenvalues 0 and $-(\lambda+\mu)$.)

Then, since the backward equation $P'(t) = Q P(t)$ with P(0) = I has solution $P(t) = e^{tQ}$,

\begin{align*}
P(t) = e^{tQ} &= \sum_{n=0}^\infty \frac{t^n Q^n}{n!} \\
 &= \sum_{n=0}^\infty \frac{t^n A D^n A^{-1}}{n!} \\
 &= A \Bigl( \sum_{n=0}^\infty \frac{t^n D^n}{n!} \Bigr) A^{-1} \\
 &= A e^{tD} A^{-1} .
\end{align*}

Now

\[ e^{tD} = \begin{pmatrix} 1 & 0 \\ 0 & e^{-(\lambda+\mu)t} \end{pmatrix} , \]

so we get

\begin{align*}
P(t) &= A e^{tD} A^{-1} \\
 &= \Pi + e^{-(\lambda+\mu)t} \bigl( I - \Pi \bigr) ,
\end{align*}

where

\[ \Pi = \frac{1}{\lambda+\mu} \begin{pmatrix} \mu & \lambda \\ \mu & \lambda \end{pmatrix} . \]
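The identity $P(t) = \Pi + e^{-(\lambda+\mu)t}(I - \Pi)$ can likewise be confirmed numerically; the rates below are illustrative.

# Verify P(t) = Pi + exp(-(lam+mu) t) (I - Pi) for the two-state chain,
# where both rows of Pi equal (mu, lam)/(lam+mu).  Rates are illustrative.
import numpy as np
from scipy.linalg import expm

lam, mu = 2.0, 3.0
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])
Pi = np.array([[mu, lam],
               [mu, lam]]) / (lam + mu)

t = 0.8
lhs = expm(t * Q)
rhs = Pi + np.exp(-(lam + mu) * t) * (np.eye(2) - Pi)
print(np.allclose(lhs, rhs))   # True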

Notice: the rows of $\Pi$ are a stationary initial distribution. If the rows are $\pi = (\pi_0, \pi_1) = \bigl( \tfrac{\mu}{\lambda+\mu}, \tfrac{\lambda}{\lambda+\mu} \bigr)$ then

\[ \pi \Pi = \pi \]

so

\[ \pi P(t) = \pi \Pi + e^{-(\lambda+\mu)t} \pi ( I - \Pi ) = \pi . \]

Moreover

\[ \lim_{t \to \infty} P(t) = \Pi . \]

Fact: $\pi_0 = \mu/(\lambda+\mu)$ is the long run fraction of time in state 0.

Fact:

\[ \lim_{t \to \infty} \frac{1}{t} \int_0^t 1(X(s) = j) \, ds = \pi_j . \]

This is the Ergodic Theorem in continuous time.
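A simulation sketch of the first fact for the two state chain, with illustrative rates: the fraction of simulated time spent in state 0 should be close to $\mu/(\lambda+\mu)$.

# Estimate the long run fraction of time the two-state chain spends in state 0
# and compare it with pi_0 = mu/(lam + mu).  Rates are illustrative.
import numpy as np

rng = np.random.default_rng(1)
lam, mu = 2.0, 3.0            # rate out of state 0 is lam, out of state 1 is mu
rates = {0: lam, 1: mu}

t, state, t_end, time_in_0 = 0.0, 0, 10_000.0, 0.0
while t < t_end:
    hold = rng.exponential(1.0 / rates[state])
    hold = min(hold, t_end - t)           # truncate the final holding time
    if state == 0:
        time_in_0 += hold
    t += hold
    state = 1 - state                     # two states: always jump to the other

print(time_in_0 / t_end, mu / (lam + mu))   # should be close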

Birth and Death Processes

Consider a population of X(t) individuals at time t. Suppose that, given X(t) = n, in the next time interval (t,t+h) the probability of a population increase of 1 (called a birth) is $\lambda_n h + o(h)$ and the probability of a decrease of 1 (a death) is $\mu_n h + o(h)$.

Jargon: X is a birth and death process.

Special cases:

All $\mu_n = 0$: called a pure birth process.

All $\lambda_n = 0$ (0 is absorbing): pure death process.

$\lambda_n = n\lambda$ and $\mu_n = n\mu$: a linear birth and death process (simulated in the sketch after this list).

$\lambda_n \equiv \lambda$, $\mu_n \equiv 0$: Poisson Process.

$\lambda_n = n\lambda + \theta$ and $\mu_n = n\mu$: a linear birth and death process with immigration.
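A minimal simulation sketch of the linear birth and death case mentioned above; the rates and the helper name simulate are illustrative choices, not part of the notes.

# Simulate a linear birth and death process: birth rate n*lam and death rate
# n*mu when the population is n (rates chosen only for illustration).
import numpy as np

rng = np.random.default_rng(2)
lam, mu = 1.0, 1.2

def simulate(n0, t_end):
    """Return the population at time t_end, starting from n0 individuals."""
    t, n = 0.0, n0
    while n > 0:                              # state 0 is absorbing
        total_rate = n * (lam + mu)
        t += rng.exponential(1.0 / total_rate)
        if t >= t_end:
            break
        # next event is a birth with probability lam/(lam+mu), else a death
        n += 1 if rng.random() < lam / (lam + mu) else -1
    return n

print([simulate(5, 3.0) for _ in range(10)])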

Queuing Theory

Ingredients of Queuing Problem:

1: Queue input process.

2: Number of servers

3: Queue discipline: first come first serve? last in first out? pre-emptive priorities?

4: Service time distribution.

Example: Imagine customers arriving at a facility at the times of a Poisson process N with rate $\lambda$. This is the input process, denoted M (for Markov) in the queuing literature.

Single server case:

Service distribution: exponential service times, rate $\mu$.

Queue discipline: first come first serve.

X(t) = number of customers in line at time t.

X is a Markov process called the M/M/1 queue; it is a birth and death process with

\[ \lambda_n = \lambda \quad (n \ge 0) \]

\[ \mu_n = \mu \quad (n \ge 1), \qquad \mu_0 = 0 . \]

Example: the $M/M/\infty$ queue:

Customers arrive according to a Poisson process with rate $\lambda$. Each customer begins service immediately. X(t) is the number being served at time t. X is a birth and death process with

\[ \lambda_n = \lambda \]

and

\[ \mu_n = n\mu . \]
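Both queues fit the birth and death framework; the sketch below simply encodes their birth and death rates as functions of the number n of customers in the system (the function names and numerical rates are illustrative), so either queue could be simulated with the same machinery as the birth and death sketch above.

# Birth and death rates for the M/M/1 and M/M/infinity queues as functions of
# the number n in the system (lam, mu are illustrative values).
lam, mu = 3.0, 4.0

def mm1_rates(n):
    """M/M/1: arrivals at rate lam; one server working at rate mu when n >= 1."""
    return lam, (mu if n >= 1 else 0.0)

def mminf_rates(n):
    """M/M/infinity: arrivals at rate lam; n customers each served at rate mu."""
    return lam, n * mu

print(mm1_rates(0), mm1_rates(5))
print(mminf_rates(0), mminf_rates(5))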





Richard Lockhart
Friday November 3 09:36:13 PST 2000