Spitzer’s Lyapunov Ergodicity

We show that relative entropy decreases for continuous-time Markov chains.

Consider an irreducible continuous-time Markov chain with state space \mathcal{X}, Q-matrix Q and stationary distribution \boldsymbol{\pi}. The forward equation for this Markov chain is given by

\frac{d}{dt} p_x(t) = \sum_{y \in \mathcal{X}} p_y(t) Q_{yx}, \qquad x \in \mathcal{X},

or, in matrix form, \dot{\boldsymbol{p}}(t) = \boldsymbol{p}(t) Q.

Throughout this section we assume that \boldsymbol{p}(t) and \boldsymbol{q}(t) are two solutions to the above system of differential equations.
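As a quick numerical sanity check (an illustrative sketch, not part of the argument), the forward equation can be integrated by explicit Euler for a small hypothetical Q-matrix:

```python
import numpy as np

# Hypothetical 3-state irreducible Q-matrix: off-diagonal entries are
# non-negative rates and every row sums to zero.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

def forward_solve(p0, Q, t, steps=100_000):
    """Integrate the forward equation dp/dt = p Q by explicit Euler."""
    p = np.array(p0, dtype=float)
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p

p = forward_solve([1.0, 0.0, 0.0], Q, t=5.0)
print(p)  # stays a probability vector
```

Since each row of Q sums to zero, every Euler step preserves the total mass, so p(t) remains a probability vector up to floating-point error. (For this particular Q the columns also sum to zero, so the stationary distribution is uniform.)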

Recall that the relative entropy between distributions \boldsymbol{p} and \boldsymbol{q} is

Ex 1. Show that, for positive numbers a and b,

a \log \frac{a}{b} \;\geq\; a - b,

with equality iff b=a.
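A quick numerical check of this inequality (an illustrative sketch) on random positive pairs:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(0.01, 10.0, size=10_000)
b = rng.uniform(0.01, 10.0, size=10_000)

# a log(a/b) - (a - b) should be non-negative, vanishing only at a = b.
gap = a * np.log(a / b) - (a - b)
print(gap.min())
```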

Ex 2. Show that, for any function f(y),

\sum_{x, y \in \mathcal{X}} q_y \, Q_{yx} \, f(y) = 0.

Ex 3. Show that

\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t)) = \sum_{x,y \in \mathcal{X}} q_y(t) \, Q_{yx} \left[ a_y \log \frac{a_x}{a_y} - (a_x - a_y) \right],

where we define a_x = p_x(t)/q_x(t).
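This identity can be checked numerically by comparing its right-hand side with a finite-difference derivative of D along the flow (an illustrative sketch with a hypothetical 3-state Q-matrix):

```python
import numpy as np

# Hypothetical 3-state Q-matrix (rows sum to zero, off-diagonals >= 0).
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

def D(p, q):
    """Relative entropy D(p || q) = sum_x p_x log(p_x / q_x)."""
    return float(np.sum(p * np.log(p / q)))

def rhs(p, q):
    """sum_{x,y} q_y Q_{yx} [ a_y log(a_x / a_y) - (a_x - a_y) ]."""
    a = p / q
    total = 0.0
    for y in range(len(p)):
        for x in range(len(p)):
            total += q[y] * Q[y, x] * (a[y] * np.log(a[x] / a[y]) - (a[x] - a[y]))
    return total

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

# Central finite difference of t -> D(p(t) || q(t)) at t = 0,
# using dp/dt = p Q and dq/dt = q Q.
h = 1e-6
numeric = (D(p + h * (p @ Q), q + h * (q @ Q))
           - D(p - h * (p @ Q), q - h * (q @ Q))) / (2 * h)
print(rhs(p, q), numeric)
```

The two numbers agree to many decimal places, and both are negative, as the next exercise predicts.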


Ex 4. Show that

\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t)) \leq 0,

with equality iff \boldsymbol{p}(t) = \boldsymbol{q}(t).


Ex 5. If \boldsymbol{\pi} is the stationary distribution, then

\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{\pi}) \leq 0,

with equality iff \boldsymbol{p}(t) = \boldsymbol{\pi}.

(From here a standard Lyapunov argument can be used to prove ergodicity.)
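To illustrate the decrease of relative entropy numerically, one can integrate two solutions of the forward equation and record D along the way (an illustrative sketch with a hypothetical Q-matrix; its columns also sum to zero, so the uniform distribution is stationary and taking q(0) uniform reproduces the stationary case):

```python
import numpy as np

# Hypothetical 3-state Q-matrix; columns also sum to zero,
# so the uniform distribution is stationary.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

def D(p, q):
    """Relative entropy D(p || q)."""
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.8, 0.1, 0.1])   # solution p(t)
q = np.array([0.1, 0.1, 0.8])   # solution q(t)
dt = 1e-4
history = [D(p, q)]
for step in range(20_000):
    p = p + dt * (p @ Q)          # Euler step for p
    q = q + dt * (q @ Q)          # Euler step for q
    if (step + 1) % 1000 == 0:
        history.append(D(p, q))   # sample D every 0.1 time units

print(history[0], history[-1])
```

The recorded values decrease monotonically toward zero as both solutions converge to the stationary distribution.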


Ans 1. The function a \mapsto a \log \frac{a}{b} is strictly convex, and a - b is its tangent at a = b. A strictly convex function lies above its tangent, touching it only at the point of tangency.

Ans 2. Immediate, since the rows of a Q-matrix sum to zero:

\sum_{x,y} q_y Q_{yx} f(y) = \sum_y q_y f(y) \sum_x Q_{yx} = 0.

Ans 3. Define a_x = p_x / q_x. Then

\begin{aligned}
\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t))
&= \sum_x \frac{d}{dt}\left[ p_x \log \frac{p_x}{q_x} \right] \\
&= \sum_x \left[ \dot{p}_x \log a_x + \dot{p}_x - a_x \dot{q}_x \right] \\
&= \sum_{x,y} q_y Q_{yx} \left[ a_y \log a_x + a_y - a_x \right] \\
&= \sum_{x,y} q_y Q_{yx} \left[ a_y \log \frac{a_x}{a_y} - (a_x - a_y) \right].
\end{aligned}

In the 3rd equality we apply the forward equation, \dot{p}_x = \sum_y p_y Q_{yx} = \sum_y q_y a_y Q_{yx} and \dot{q}_x = \sum_y q_y Q_{yx}. In the 4th equality we apply [2] with f(y) = -a_y \log a_y.

Ans 4. The inequality follows by applying [1] with a = a_y and b = a_x to each term of [3]: every summand with x \neq y is non-positive (since Q_{yx} \geq 0 for x \neq y), and the diagonal terms vanish.

For the equality condition, suppose that \frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t)) = 0. By irreducibility, q_y(t) > 0 for all y. Take two states x, y that communicate directly, i.e. Q_{yx} > 0. For this x and y, the corresponding term in the sum in [3] can only be zero if a_x = a_y. By irreducibility this holds for all pairs of states, so \boldsymbol{a} is constant; since \boldsymbol{p} and \boldsymbol{q} both sum to one, \boldsymbol{p} = \boldsymbol{q}.

Ans 5. Since \boldsymbol{\pi} Q = \boldsymbol{0}, the constant function \boldsymbol{q}(t) = \boldsymbol{\pi} solves the forward equation, so [4] applies with \boldsymbol{q}(t) = \boldsymbol{\pi}.
