Markov Chains Recurrence

We quickly recap the recurrence properties of Markov chains. The aim is to give a short review, not a fully rigorous treatment.

Recurrence and Transience

For a Markov chain (X_t : t \in \mathbb{Z}_+), we say that a state x is

\displaystyle \text{recurrent} \quad \text{if} \quad \sum_{t=1}^{\infty} \mathbb{I}[X_t = x] = \infty \quad \text{with probability 1}

\displaystyle \text{transient} \quad \text{if} \quad \sum_{t=1}^{\infty} \mathbb{I}[X_t = x] < \infty \quad \text{with probability 1}

\displaystyle \text{positive recurrent} \quad \text{if} \quad \liminf_{T \to \infty} \frac{1}{T} \sum_{t=1}^{T} \mathbb{I}[X_t = x] > 0 \quad \text{with probability 1}

\displaystyle \text{null recurrent} \quad \text{if it is recurrent but not positive recurrent}

Thus, a state x is either visited infinitely often (recurrent) or finitely often (transient). If it is recurrent and a positive proportion of time is spent at x, then it is positive recurrent; otherwise, it is null recurrent.
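The long-run fraction of time in the positive-recurrence definition is easy to observe empirically. As a minimal sketch (with an assumed example, not taken from the text), consider a reflected random walk on \mathbb{Z}_+ that steps up with probability p and down otherwise; for p < 1/2 the drift is toward 0, so state 0 is positive recurrent and the empirical fraction of time at 0 stays bounded away from zero.

```python
import random

def fraction_of_time_at(p, steps, target=0, seed=0):
    """Simulate a reflected random walk on {0, 1, 2, ...}:
    from x move to x+1 w.p. p; otherwise move to x-1 (or stay, if x = 0).
    Return the empirical fraction of time spent at `target`."""
    rng = random.Random(seed)
    x, visits = 0, 0
    for _ in range(steps):
        if x == target:
            visits += 1
        if rng.random() < p:
            x += 1
        elif x > 0:
            x -= 1
    return visits / steps

# With p = 0.3 the walk drifts toward 0: the fraction of time at 0
# settles near a positive limit rather than decaying to zero.
print(fraction_of_time_at(0.3, 100_000))
```

For p > 1/2 the same code exhibits transience: the walk drifts to infinity and the fraction of time at 0 tends to 0.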

Figure: a Venn diagram of the definitions above.

In queueing systems, we typically want the empty state to be positive recurrent, while transience would be undesirable. Null recurrence can be viewed as a borderline case.


Hitting Times and Recurrence

We can characterize recurrence in terms of return times. Let

\displaystyle \tau_x := \inf \{ t > 0 : X_t = x \},

with the convention that \tau_x = \infty if x is never revisited, and define

\displaystyle h_x = \mathbb{P}_x(\tau_x < \infty), \quad H_x = \mathbb{E}_x[\tau_x]

Then we have the following characterizations.

Proposition. The state x is

\displaystyle \text{recurrent} \iff h_x = 1

\displaystyle \text{transient} \iff h_x < 1

\displaystyle \text{positive recurrent} \iff H_x < \infty

These quantities can be computed using the transition matrix P.
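As a sanity check on the proposition, h_x and H_x can also be estimated by simulating return times. Below is a hedged sketch for an assumed two-state chain (the matrix P is an illustrative example, not from the text); for this chain the state is recurrent, so the estimated h_x should be 1 and the estimated H_x should be close to the true mean return time.

```python
import random

# Assumed example: a two-state chain with transition matrix P.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def sample_return_time(x, rng, max_steps=10_000):
    """Draw one sample of the return time tau_x, starting from x.
    Returns None if no return is observed within max_steps."""
    state = x
    for t in range(1, max_steps + 1):
        u, cum = rng.random(), 0.0
        for y, p in enumerate(P[state]):
            cum += p
            if u < cum:
                state = y
                break
        if state == x:
            return t
    return None

rng = random.Random(0)
samples = [sample_return_time(0, rng) for _ in range(20_000)]
h_est = sum(t is not None for t in samples) / len(samples)      # estimate of h_x
H_est = sum(t for t in samples if t is not None) / len(samples)  # estimate of H_x
print(h_est, H_est)
```

For this chain a direct first-step calculation gives H_0 = 0.9 \cdot 1 + 0.1 \cdot (1 + 2) = 1.2, which the simulation should approximately reproduce.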

For x_0 \neq x, define

\displaystyle h(x_0) = \mathbb{P}_{x_0}(\tau_x < \infty), \quad H(x_0) = \mathbb{E}_{x_0}[\tau_x]

and set h(x) = 1, H(x) = 0.

Theorem. For any x_0 \neq x, the function h(\cdot) satisfies

\displaystyle h(x_0) = \sum_{y \in \mathcal{X}} P_{x_0 y} h(y)

and H(\cdot) satisfies

\displaystyle H(x_0) = 1 + \sum_{y \in \mathcal{X}} P_{x_0 y} H(y)

Proof (sketch). Conditioning on the first step X_1 = x_1 and applying the Markov property gives

\displaystyle h(x_0) = \sum_{x_1 \in \mathcal{X}} P_{x_0 x_1} h(x_1),

since, once the chain is at x_1, the probability of ever reaching x is h(x_1). The identity for H(\cdot) follows in the same way, with the added 1 accounting for the first step.
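For a finite chain, the theorem turns the computation of expected hitting times into a linear system: restricting to the states other than x, the equations read (I - Q)H = \mathbf{1}, where Q is P with the row and column of x removed. A minimal sketch with an assumed 3-state chain (the matrix is an illustrative example):

```python
import numpy as np

# Assumed example: a 3-state chain; target state x = 0.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.4, 0.0, 0.6]])
x = 0
others = [i for i in range(len(P)) if i != x]

# H(x) = 0 and, for x0 != x, H(x0) = 1 + sum_y P[x0, y] H(y).
# Restricted to the states other than x this is (I - Q) H = 1.
Q = P[np.ix_(others, others)]
H_others = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))

H = np.zeros(len(P))
H[others] = H_others

# Mean return time: H_x = 1 + sum_y P[x, y] H(y).
H_x = 1 + P[x] @ H
print(H, H_x)
```

As a cross-check, the stationary distribution of this chain has \pi(0) = 28/73, and the computed mean return time agrees with 1/\pi(0) = 73/28.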


Communication and Irreducibility

Definition (Communicate). Two states x and y communicate if

\displaystyle \mathbb{P}(X_t = y \mid X_0 = x) > 0 \quad \text{and} \quad \mathbb{P}(X_s = x \mid X_0 = y) > 0

for some t and s.
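Whether two states communicate depends only on which transitions have positive probability, so it can be checked by a reachability search on the transition matrix. A minimal sketch (the 3-state matrix below is an assumed example with an absorbing state):

```python
def communicates(P, x, y):
    """Check whether x and y communicate: each must be reachable from
    the other through transitions of positive probability."""
    def reachable(a, b):
        seen, stack = {a}, [a]
        while stack:
            s = stack.pop()
            if s == b:
                return True
            for t, p in enumerate(P[s]):
                if p > 0 and t not in seen:
                    seen.add(t)
                    stack.append(t)
        return False
    return reachable(x, y) and reachable(y, x)

# Assumed example: state 2 is absorbing, so 0 and 2 do not communicate
# (2 is reachable from 0, but not the other way around).
P = [[0.5, 0.5, 0.0],
     [0.3, 0.2, 0.5],
     [0.0, 0.0, 1.0]]
print(communicates(P, 0, 1), communicates(P, 0, 2))
```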

The recurrence properties of states are determined by communication.

Proposition. If two states communicate, then they are either both positive recurrent, both null recurrent, or both transient.

In many queueing systems, every state communicates with the empty state. Since communication is symmetric and transitive, all states then communicate with one another, so such chains are typically irreducible.

Definition (Irreducible). A Markov chain is irreducible if all states communicate.

For irreducible Markov chains, recurrence is a property of the entire chain rather than individual states.
