Foster’s Lemma provides a natural condition to prove the positive recurrence of a Markov chain.

  • Let X=(X_n)_{n\in\mathbb Z_+} be a discrete-time Markov chain on countable, irreducible state-space \mathcal X.

Theorem [Foster’s Lemma] If L:\mathcal X\rightarrow\mathbb R_+ is a function with \mathbb E L(X_0) <\infty and such that for some K>k\geq 0 and some \epsilon>0, \mathcal X(k)=\{x\in\mathcal X: L(x)\leq k\} is finite, and

\mathbb E\big[ L(X_{n+1}) - L(X_n) \,\big|\, X_n = x \big] \leq \begin{cases} -\epsilon, & x \notin \mathcal X(k), \\ K, & x \in \mathcal X(k), \end{cases}

then X is positive recurrent.

Consider a process that jumps to K then, at each unit of time, moves down \epsilon until it moves below k, and then jumps back up to K. For every unit of time spent below k, this process spends (K-k)/\epsilon units of time above k. So the proportion of time the process spends below k is about \epsilon/(\epsilon + (K-k)). In other words, the process is positive recurrent on the states less than or equal to k. This is Foster’s Lemma in a nutshell; the conditions of Foster’s Lemma say that, on average, this is the worst that can happen to L(X_n). We need to turn this intuition about an averaged process into a rigorous proof about our stochastic process.
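This back-of-the-envelope calculation can be checked with a short simulation of the deterministic sawtooth process just described (a sketch, not part of the original argument; the values of K, k and \epsilon are arbitrary illustrative choices):

```python
# Deterministic "sawtooth" process from the intuition above: jump to K,
# drift down by epsilon each step, jump back to K once at or below k.
# (K, k, eps are illustrative values, not from the original post.)
K, k, eps = 10.0, 2.0, 0.5

steps = 100_000
x = K
below = 0  # number of steps spent at or below k
for _ in range(steps):
    if x <= k:
        below += 1
        x = K          # jump back up to K
    else:
        x -= eps       # deterministic downward drift

print(below / steps)            # empirical fraction of time at or below k
print(eps / (eps + (K - k)))    # predicted fraction eps/(eps + K - k)
```

The empirical fraction of time spent at or below k matches the predicted proportion \epsilon/(\epsilon + (K-k)).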

  • We will often refer to the function L used in Foster’s Lemma as a Lyapunov function.

Proof. For any discrete-time Markov chain, \lim_{N\rightarrow\infty}\frac{1}{N}\mathbb E \sum_{n=1}^N\mathbb I[X_n=x] exists and, moreover, the limit is positive iff x is positive recurrent. Taking expectations in the drift condition and summing over n=0,\dots,N-1 gives

0 \leq \mathbb E L(X_N) \leq \mathbb E L(X_0) - \epsilon \sum_{n=0}^{N-1} \mathbb P\big(X_n \notin \mathcal X(k)\big) + K \sum_{n=0}^{N-1} \mathbb P\big(X_n \in \mathcal X(k)\big).

Rearranging this expression and taking limits, we get that

\liminf_{N\rightarrow\infty} \frac{1}{N} \sum_{n=1}^{N} \mathbb P\big(X_n \in \mathcal X(k)\big) \geq \frac{\epsilon}{K+\epsilon}.

Thus, since \mathcal X(k)=\{x\in\mathcal X: L(x)\leq k\} is finite, for some x\in\mathcal X(k),

\liminf_{N\rightarrow\infty} \frac{1}{N} \sum_{n=1}^{N} \mathbb P(X_n = x) \geq \frac{\epsilon}{(K+\epsilon)\,|\mathcal X(k)|} > 0.
In other words, x is positive recurrent and thus our irreducible Markov chain X is positive recurrent.
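To make this concrete, here is a hypothetical example not in the original text: a reflected random walk on \mathbb Z_+ with downward drift. Taking L(x)=x, the drift condition of Foster’s Lemma holds with k=0 and \mathcal X(0)=\{0\} finite, and a simulation estimates the long-run fraction of time spent in \mathcal X(0):

```python
import random

# Illustrative example (not from the original post): a reflected random
# walk on Z_+ stepping down with probability p > 1/2, up with q = 1 - p.
# With L(x) = x, E[L(X_{n+1}) - L(X_n) | X_n = x] = q - p = -eps for all
# x >= 1, so Foster's condition holds with k = 0 and X(0) = {0} finite.
random.seed(1)
p, q = 0.6, 0.4          # downward drift: eps = p - q = 0.2
N = 200_000

x, time_at_zero = 0, 0
for _ in range(N):
    step = -1 if random.random() < p else +1
    x = max(x + step, 0)
    time_at_zero += (x == 0)

# The long-run fraction of time in X(k) = {0} is positive, witnessing
# positive recurrence; for this birth-death chain it equals 1 - q/p = 1/3.
print(time_at_zero / N)
```

The positive long-run fraction of time at state 0 is exactly the conclusion the proof extracts from the drift condition.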

There are numerous generalizations of Foster’s Lemma, but all roughly follow this structure.

We consider one such generalization, the Multiplicative Foster’s Lemma. Here we allow the Markov chain to have a downward drift over a period of time proportional to the size of its initial state. This version of Foster’s Lemma will be useful when we take a deterministic process as the limit of our stochastic process.

  • Let X=(X(t))_{t\in\mathbb R_+} be a continuous-time Markov chain on countable, irreducible state-space \mathcal X.
  • We let |\cdot| denote a norm on \mathcal X.^1

Theorem [Multiplicative Foster’s Lemma] If \mathbb E |X(0)| <\infty and if for some K>k\geq 0, some \epsilon>0 and some T>0, \mathcal X(k)=\{x\in\mathcal X: |x|\leq k\} is finite, and

\mathbb E\big[\, |X(T|x| \vee \tau)| - |x| \,\big|\, X(0)=x \big] \leq \begin{cases} -\epsilon |x|, & x \notin \mathcal X(k), \\ K, & x \in \mathcal X(k), \end{cases}

where \tau is the next jump time of the Markov chain, then X is positive recurrent.

  • When we take a fluid limit, we rescale space and time proportionally to get a deterministic limit. Observe that the drift condition above remains invariant when we rescale space and time proportionally.
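A hedged illustration of this multiplicative drift (an assumed example, not from the original post): an M/M/1 queue with arrival rate lam and service rate mu > lam, started from a large state x0 and run for a time horizon proportional to x0, drains at rate mu - lam, so the expected change in |X| is roughly -(mu - lam)*T*x0, a downward drift proportional to the initial state:

```python
import random

# M/M/1 queue: a continuous-time Markov chain on Z_+ with arrival rate
# lam and service rate mu > lam.  (lam, mu, T are illustrative values.)
random.seed(2)
lam, mu, T = 0.5, 1.0, 0.5

def mm1_at_time(x0, horizon):
    """Simulate an M/M/1 queue started at x0; return its state at `horizon`."""
    t, x = 0.0, x0
    while True:
        rate = lam + (mu if x > 0 else 0.0)
        t += random.expovariate(rate)      # holding time until next jump
        if t > horizon:
            return x
        if random.random() < lam / rate:
            x += 1                         # arrival
        else:
            x -= 1                         # departure

ratios = []
for x0 in (100, 200, 400):
    drift = sum(mm1_at_time(x0, T * x0) - x0 for _ in range(200)) / 200
    ratios.append(drift / x0)
    print(x0, drift / x0)   # roughly -(mu - lam)*T = -0.25 for every x0
```

Note that the ratio drift/x0 is approximately the same for each initial state: this is the scale-invariance under proportional rescaling of space and time noted in the remark above.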

Proof. We consider our Markov chain at specific stopping times: t_0:=0 and, for n\in\mathbb N,

t_n := t_{n-1} + \max\big\{\, T\,|X(t_{n-1})|,\ \tau(t_{n-1}) \,\big\},

where \tau(t_{n-1}) is the time to the next jump of our Markov chain after time t_{n-1}. Arguing as in the proof of Foster’s Lemma, we sum the drift condition, rearrange, divide by N and take limits to obtain

\liminf_{N\rightarrow\infty} \frac{1}{N} \sum_{n=1}^{N} \mathbb P\big( X(t_n) \in \mathcal X(k) \big) > 0.
Thus, X is positive recurrent, as it is positive recurrent on the finite set of states \mathcal X(k)=\{x\in\mathcal X: |x|\leq k\}.

  1. We do not really need a norm here, just some notion of distance; but to be safe we assume it is a norm.

