# Spitzer’s Lyapunov Ergodicity

We show that relative entropy decreases for continuous-time Markov chains.

Consider an irreducible continuous-time Markov chain with state space ${\mathcal X}$, $Q$-matrix $Q$ and stationary distribution $\boldsymbol{\pi}$. The forward equation for this Markov chain is given by

$$
\frac{d p_y(t)}{dt} = \sum_{x \in \mathcal{X}} p_x(t)\, Q_{xy}, \qquad y \in \mathcal{X}.
$$

Throughout this section we assume that $\boldsymbol{p}(t)$ and $\boldsymbol{q}(t)$ are two solutions to the above system of differential equations.
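As a quick numerical sanity check, the forward equation can be integrated by Euler steps. The $Q$-matrix below is a made-up three-state example (any matrix with non-negative off-diagonal rates and zero row sums would do); the point is only that the solution remains a probability distribution.

```python
import numpy as np

# A small hypothetical Q-matrix: off-diagonal rates >= 0, rows sum to zero.
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -3.0, 2.0],
              [0.5, 0.5, -1.0]])

def forward_euler(p0, Q, t, steps=10000):
    """Integrate the forward equation dp/dt = p Q from p(0) = p0 up to time t."""
    p = np.asarray(p0, dtype=float)
    h = t / steps
    for _ in range(steps):
        p = p + h * (p @ Q)
    return p

p0 = np.array([1.0, 0.0, 0.0])
p_t = forward_euler(p0, Q, t=1.0)
print(p_t, p_t.sum())  # mass is conserved: the entries still sum to 1
```

Because each row of $Q$ sums to zero, every Euler step preserves the total mass exactly (up to floating-point error).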

Recall that the relative entropy between distributions $\boldsymbol{p}$ and $\boldsymbol{q}$ is

$$
D(\boldsymbol{p} \,||\, \boldsymbol{q}) = \sum_{x \in \mathcal{X}} p_x \log \frac{p_x}{q_x}.
$$

Ex 1. Show that, for positive numbers $a$ and $b$,

$$
a \log \frac{a}{b} \geq a - b,
$$

with equality iff $b=a$.
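The inequality in Ex 1 is easy to spot-check numerically; the sketch below samples random positive pairs $(a,b)$ (the sampling range is arbitrary) and verifies $a \log(a/b) \geq a - b$.

```python
import math
import random

# Spot-check of a*log(a/b) >= a - b for positive a, b (equality iff a == b),
# the elementary inequality driving the entropy argument.
random.seed(0)
for _ in range(10000):
    a = random.uniform(1e-3, 10.0)
    b = random.uniform(1e-3, 10.0)
    assert a * math.log(a / b) >= a - b - 1e-12  # small slack for rounding
print("inequality holds on all samples")
```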

Ex 2. Show that, for any function $f(y)$,

$$
\sum_{x, y \in \mathcal{X}} q_y(t)\, Q_{yx}\, f(y) = 0.
$$

Ex 3. Show that

$$
\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t)) = \sum_{x, y \in \mathcal{X}} q_y(t)\, Q_{yx} \left( a_y \log \frac{a_x}{a_y} - a_x \right),
$$

where we define $a_x = p_x(t)/q_x(t)$.

Ex 4. Show that

$$
\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t)) \leq 0,
$$

with equality iff $\boldsymbol{p}(t) = \boldsymbol{q}(t)$.

Ex 5. If $\boldsymbol{\pi}$ is the stationary distribution, then

$$
\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{\pi}) \leq 0.
$$

(From here a standard Lyapunov argument can be used to prove ergodicity.)
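Ex 5 can also be illustrated numerically. For a hypothetical $Q$-matrix, compute $\boldsymbol{\pi}$ from $\boldsymbol{\pi} Q = \boldsymbol{0}$ and evolve $\boldsymbol{p}(t)$ with the discrete scheme $\boldsymbol{p} \leftarrow \boldsymbol{p}(I + hQ)$; for $h$ small enough that $I + hQ$ is a stochastic matrix, the relative entropy to $\boldsymbol{\pi}$ is non-increasing at every step.

```python
import numpy as np

# Hypothetical Q-matrix for illustration (rows sum to zero).
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -3.0, 2.0],
              [0.5, 0.5, -1.0]])

# Stationary distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

def rel_entropy(p, q):
    return float(np.sum(p * np.log(p / q)))

h = 1e-3                      # I + h*Q is then a stochastic matrix
p = np.array([0.9, 0.05, 0.05])
D0 = rel_entropy(p, pi)
D_prev = D0
for _ in range(2000):
    p = p + h * (p @ Q)
    D = rel_entropy(p, pi)
    assert D <= D_prev + 1e-12  # D(p(t) || pi) never increases
    D_prev = D
print("relative entropy fell from", D0, "to", D_prev)
```

Monotonicity holds exactly for the discrete iteration as well, since $P = I + hQ$ is a stochastic matrix with $\boldsymbol{\pi} P = \boldsymbol{\pi}$, and relative entropy contracts under any stochastic map.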

Ans 1. The function $a \mapsto a\log \frac{a}{b}$ is strictly convex in $a$, and $a - b$ is its tangent at $a=b$; a strictly convex function lies above its tangent, with equality only at the point of tangency.

Ans 2. Trivial: $\sum_{x,y} q_y(t) Q_{yx} f(y) = \sum_y q_y(t) f(y) \sum_x Q_{yx} = 0$, since each row of $Q$ sums to zero.

Ans 3. Define $a_x = p_x /q_x$ (suppressing the dependence on $t$). Then

$$
\begin{aligned}
\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t))
&= \sum_x \dot{p}_x \log a_x + \sum_x \dot{p}_x - \sum_x a_x \dot{q}_x \\
&= \sum_x \dot{p}_x \log a_x - \sum_x a_x \dot{q}_x \\
&= \sum_{x,y} q_y Q_{yx} \left( a_y \log a_x - a_x \right) \\
&= \sum_{x,y} q_y Q_{yx} \left( a_y \log \frac{a_x}{a_y} - a_x \right).
\end{aligned}
$$

In the 2nd equality we use $\sum_x \dot{p}_x = 0$. In the 3rd equality we apply the forward equation. In the 4th equality we apply Ex 2 with $f(y)=-a_y \log a_y$.
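The identity in Ex 3 can be checked against a finite-difference derivative. The $Q$-matrix and the starting distributions below are made up for illustration; the analytic sum and the numerical derivative should agree to order $h$.

```python
import numpy as np

# Hypothetical irreducible Q-matrix (rows sum to zero).
Q = np.array([[-2.0, 1.0, 1.0],
              [1.0, -3.0, 2.0],
              [0.5, 0.5, -1.0]])

p = np.array([0.5, 0.3, 0.2])   # arbitrary initial distributions
q = np.array([0.2, 0.5, 0.3])

def D(p, q):
    return float(np.sum(p * np.log(p / q)))

# Right-hand side of Ex 3: sum_{x,y} q_y Q_{yx} (a_y log(a_x/a_y) - a_x).
a = p / q
rhs = sum(q[y] * Q[y, x] * (a[y] * np.log(a[x] / a[y]) - a[x])
          for x in range(3) for y in range(3))

# Finite-difference derivative of D(p(t) || q(t)), evolving both
# p and q one Euler step along the forward equation.
h = 1e-6
num = (D(p + h * (p @ Q), q + h * (q @ Q)) - D(p, q)) / h
print(rhs, num)
```

Since $\boldsymbol{p} \neq \boldsymbol{q}$ and the chain is irreducible, both values should also come out strictly negative, anticipating Ex 4.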

Ans 4. The inequality holds by applying Ex 1, with $a = a_y$ and $b = a_x$, to each term of the sum in Ex 3: for $x \neq y$ we have $Q_{yx} \geq 0$ and $a_y \log \frac{a_x}{a_y} - a_x \leq -a_y$, while the diagonal terms satisfy this with equality. Hence

$$
\frac{d}{dt} D(\boldsymbol{p}(t) \,||\, \boldsymbol{q}(t)) \leq - \sum_{x,y} q_y Q_{yx}\, a_y = 0,
$$

where the final equality is Ex 2 with $f(y) = a_y$.

For the equality condition, suppose that $\frac{d}{dt} D(\boldsymbol{p}(t) || \boldsymbol{q}(t) ) = 0$. By irreducibility, $p_y(t) > 0$ and $q_y(t) > 0$ for all $y$. Take two states $x, y$ that communicate directly, i.e. $Q_{yx} > 0$. For this $x$ and $y$, the corresponding term in the sum in Ex 3 can only be zero if $a_x = a_y$, by the strict convexity in Ex 1. By irreducibility this holds for all pairs of states, so $a_x$ is constant; since $\boldsymbol{p}$ and $\boldsymbol{q}$ are both probability distributions, the constant must be $1$, thus $\boldsymbol{p} = \boldsymbol{q}$.

Ans 5. Obvious: since $\boldsymbol{\pi} Q = \boldsymbol{0}$, the constant path $\boldsymbol{q}(t) = \boldsymbol{\pi}$ solves the forward equation, so Ex 4 applies.