
Continuous Time Markov Chains

Consider a population of single celled organisms in a stable environment.

Fix a short time interval of length h.

Each organism has some probability of dividing to produce two organisms and some other probability of dying.

We might suppose that, during any interval of length h, each organism independently satisfies

$$P(\text{organism divides}) = \lambda h + o(h), \qquad P(\text{organism dies}) = \mu h + o(h).$$

Tacit assumptions:

The constants of proportionality $\lambda$ and $\mu$ do not depend on time: ``stable environment''.

The constants do not depend on the organism: organisms are all similar and live in similar environments.

Y(t): total population at time t.

$\{Y(s):\, s \le t\}$: history of the process up to time t.

Condition on the event Y(t) = n.

Probability of two or more divisions (more than one division by a single organism or two or more organisms dividing) is o(h).

Probability of both a division and a death or of two or more deaths is o(h).

So the probability of exactly 1 division by any one of the n organisms is $n\lambda h + o(h)$.

Similarly, the probability of exactly 1 death is $n\mu h + o(h)$.

We deduce:

\begin{align*}
P(Y(t+h) = n+1 \mid Y(t) = n, Y(s),\, s < t) &= n\lambda h + o(h) \\
P(Y(t+h) = n-1 \mid Y(t) = n, Y(s),\, s < t) &= n\mu h + o(h) \\
P(Y(t+h) = n \mid Y(t) = n, Y(s),\, s < t) &= 1 - n(\lambda + \mu) h + o(h)
\end{align*}
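These short-interval probabilities can be checked by Monte Carlo. The sketch below is illustrative (the function name `simulate_step` and the parameter values are assumptions, not part of the notes): each of the n organisms independently divides with probability roughly $\lambda h$ and dies with probability roughly $\mu h$, and we estimate the chance the population goes up by exactly one.

```python
import random

def simulate_step(n, lam, mu, h, rng):
    # Over a short interval of length h, each of the n organisms
    # independently divides with probability ~ lam*h and dies with
    # probability ~ mu*h (a discrete-time approximation of the model).
    births = sum(rng.random() < lam * h for _ in range(n))
    deaths = sum(rng.random() < mu * h for _ in range(n))
    return n + births - deaths

rng = random.Random(0)
lam, mu, h, n = 1.0, 0.5, 0.001, 10
trials = 200_000
up = sum(simulate_step(n, lam, mu, h, rng) == n + 1 for _ in range(trials))
# The Monte Carlo estimate should be close to n*lam*h = 0.01;
# the o(h) terms contribute only a small correction.
print(up / trials)
```

With h this small, events involving two or more births or deaths are rare enough that the estimate lands near $n\lambda h$, as the derivation predicts.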

Since these conditional probabilities depend on the history only through the current value Y(t) = n, the equations lead to:

$$P(Y(t+h) = k \mid Y(s),\, s \le t) = P(Y(t+h) = k \mid Y(t)).$$

This is the Markov Property.

Definition: A process $\{X(t);\, t \ge 0\}$ taking values in S, a finite or countable state space, is a Markov Chain if, for all times $t_1 < t_2 < \cdots < t_{n+1}$ and all states $i_1, \dots, i_n, j$ in S,

$$P(X(t_{n+1}) = j \mid X(t_1) = i_1, \dots, X(t_n) = i_n) = P(X(t_{n+1}) = j \mid X(t_n) = i_n).$$

Definition: A Markov chain X has stationary transitions if

$$P(X(t+s) = j \mid X(s) = i) = P(X(t) = j \mid X(0) = i) \quad \text{for all } s, t \ge 0.$$

From now on: our chains have stationary transitions.
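A chain with stationary transitions can be simulated directly from its rates. The sketch below uses the standard competing-exponentials construction for the linear birth-death chain above: from state n the holding time is Exponential with rate $n(\lambda+\mu)$, and the jump is a birth with probability $\lambda/(\lambda+\mu)$. The function name and parameter values are illustrative assumptions, not from the notes.

```python
import random

def simulate_birth_death(n0, lam, mu, t_end, rng):
    # Simulate the linear birth-death chain up to time t_end.
    # From state n > 0 the next event occurs after an
    # Exponential(n*(lam+mu)) holding time; it is a birth with
    # probability lam/(lam+mu), otherwise a death.  The rates depend
    # only on the current state, never on the clock time, which is
    # exactly what stationary transitions require.
    t, n = 0.0, n0
    while n > 0:
        t += rng.expovariate(n * (lam + mu))
        if t > t_end:
            break  # no further event before t_end
        n += 1 if rng.random() < lam / (lam + mu) else -1
    return n  # state 0 is absorbing: an extinct population stays extinct

rng = random.Random(1)
final = simulate_birth_death(10, 1.0, 0.5, 2.0, rng)
print(final)
```

Because the same rule is applied whether the clock reads s or 0, the simulated transition law from state i over an interval of length t is the same at every starting time, matching the definition above.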


Richard Lockhart
Tuesday October 31 11:19:18 PST 2000