We show that relative entropy decreases along continuous-time Markov chains.

# Category: Probability

## A Mean Field Limit

We consider a system consisting of interacting objects. As we let the number of objects increase, we can characterize the limiting behaviour of the system.
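As an illustration of the kind of limit in question, here is a minimal Python sketch (my own, not from the notes): $N$ objects each sit in state 0 or 1, each flipping $0 \to 1$ at a rate depending on the empirical fraction of 1's, and as $N$ grows that fraction tracks the solution of a deterministic ODE. The rates $a + bm$ and $c$, and hence the drift $(1-m)(a+bm) - cm$, are illustrative choices.

```python
import random

def mean_field_demo(N=5000, T=5.0, a=0.5, b=1.0, c=1.0, seed=0):
    """Compare an N-object interacting system with its mean field ODE.

    Each object flips 0 -> 1 at rate a + b*m, where m is the fraction of
    objects in state 1, and flips 1 -> 0 at rate c (illustrative rates).
    """
    rng = random.Random(seed)
    # Gillespie simulation of the N-object Markov chain; k = #objects in state 1
    k, t = 0, 0.0
    while True:
        up = (N - k) * (a + b * k / N)   # total rate of 0 -> 1 flips
        down = k * c                     # total rate of 1 -> 0 flips
        wait = rng.expovariate(up + down)
        if t + wait > T:
            break
        t += wait
        if rng.random() * (up + down) < up:
            k += 1
        else:
            k -= 1
    m_sim = k / N
    # Euler scheme for the limiting ODE  m' = (1 - m)(a + b m) - c m
    m, dt = 0.0, 1e-3
    for _ in range(int(T / dt)):
        m += dt * ((1 - m) * (a + b * m) - c * m)
    return m_sim, m
```

For these parameters the ODE equilibrates at $m = 1/2$, and the simulated fraction agrees with the ODE up to fluctuations of order $N^{-1/2}$.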

## Cross Entropy Method

In the *Cross Entropy Method*, we wish to estimate the likelihood

$$\ell = \mathbb{P}\big( S(X) \geq \gamma \big).$$

Here $X$ is a random variable whose distribution is known and belongs to a parametrized family of densities $\{ f(\cdot\,; v) \}$. Further, $\gamma$ is often a solution to an optimization problem.
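As a concrete (hypothetical) instance, the following Python sketch applies the multilevel cross-entropy scheme to the rare event $\{X \geq \gamma\}$ for $X$ exponential with mean $1$, tilting within the exponential family $f(x; v) = v^{-1} e^{-x/v}$. The specific numbers ($\gamma = 20$, sample size, elite fraction $\rho = 0.1$) are illustrative choices, not taken from the notes.

```python
import math
import random

def ce_rare_event(gamma=20.0, n=10000, rho=0.1, u=1.0, seed=1):
    """Estimate P(X >= gamma) for X ~ Exponential(mean u) by cross entropy."""
    rng = random.Random(seed)
    # likelihood ratio f(x; u) / f(x; v) for exponential densities with means u, v
    w = lambda x, v: (v / u) * math.exp(-x * (1.0 / u - 1.0 / v))
    v = u
    for _ in range(50):                               # CE iterations to tune v
        xs = sorted(rng.expovariate(1.0 / v) for _ in range(n))
        level = min(gamma, xs[int((1 - rho) * n)])    # (1-rho)-quantile, capped
        elite = [x for x in xs if x >= level]
        ws = [w(x, v) for x in elite]
        v = sum(wi * x for wi, x in zip(ws, elite)) / sum(ws)
        if level >= gamma:
            break
    # final importance-sampling estimate under the tilted density f(.; v)
    xs = [rng.expovariate(1.0 / v) for _ in range(n)]
    est = sum(w(x, v) for x in xs if x >= gamma) / n
    return est, v
```

The exact answer here is $e^{-\gamma} \approx 2 \times 10^{-9}$, far below what naive Monte Carlo with $10^4$ samples could see; the tilted parameter settles near $\gamma + 1$, the zero-variance choice for this family.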

## Sanov’s Theorem

Sanov’s Theorem asks how *likely* it is that the empirical distribution of some IID random variables is *far* from their common distribution, and shows that relative entropy determines the likelihood of being far.
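A quick numerical illustration (my own, not from the notes): for fair coin flips, the Sanov rate predicts that $\mathbb{P}(\text{empirical mean} \geq 0.7)$ decays like $e^{-nD}$ with $D = D(\mathrm{Ber}(0.7) \,\|\, \mathrm{Ber}(1/2))$.

```python
import math

def binom_tail(n, k0):
    """Exact P(S_n >= k0) for S_n ~ Binomial(n, 1/2)."""
    return sum(math.comb(n, k) for k in range(k0, n + 1)) / 2**n

def kl_bernoulli(a, p):
    """Relative entropy D(Ber(a) || Ber(p)) in nats."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

# empirical decay rate of P(mean of n fair coin flips >= 0.7)
n = 200
rate = -math.log(binom_tail(n, math.ceil(0.7 * n))) / n
D = kl_bernoulli(0.7, 0.5)
```

The observed rate always exceeds $D$ (the Chernoff bound is exact here) and approaches it at speed $O(\log n / n)$.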

## Entropy and Boltzmann’s Distribution

Entropy and Relative Entropy occur sufficiently often in these notes to justify a (somewhat) self-contained section. We cover the discrete case, which is the most intuitive.
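For orientation, here is the standard Lagrange-multiplier sketch (in the discrete setting, with notation of my choosing) of why Boltzmann's distribution arises as the entropy maximizer at fixed mean energy.

```latex
% Maximize H(p) = -\sum_i p_i \log p_i subject to
% \sum_i p_i = 1 and \sum_i p_i E_i = \bar{E}.
\mathcal{L}(p, \alpha, \beta)
  = -\sum_i p_i \log p_i
    - \alpha \Big( \sum_i p_i - 1 \Big)
    - \beta \Big( \sum_i p_i E_i - \bar{E} \Big).
% Stationarity in each coordinate p_i:
\frac{\partial \mathcal{L}}{\partial p_i}
  = -\log p_i - 1 - \alpha - \beta E_i = 0
\quad\Longrightarrow\quad
p_i = \frac{e^{-\beta E_i}}{Z},
\qquad Z = \sum_j e^{-\beta E_j}.
```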

## Itô’s Formula: a heuristic derivation

- A heuristic look at the stochastic integral.
- A heuristic derivation of Itô’s formula.
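The heuristic in question fits in a few lines (standard notation, $W$ a Brownian motion): Taylor expand $f(W_{t+dt})$ and apply the rule $(dW_t)^2 \approx dt$.

```latex
f(W_{t+dt}) - f(W_t)
  = f'(W_t)\, dW_t + \tfrac{1}{2} f''(W_t)\, (dW_t)^2 + \cdots
% dW_t has variance dt, giving the heuristic rule (dW_t)^2 \approx dt, so
df(W_t) = f'(W_t)\, dW_t + \tfrac{1}{2} f''(W_t)\, dt .
```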

## Basic Probability Bounds

- Markov’s Inequality; Chebyshev’s Inequality; Chernoff’s Bound.
- Bounds for the Poisson Distribution.
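To make the comparison concrete, here is a small Python check (my own numbers: a Poisson variable with mean $\lambda = 10$ and tail threshold $20$, neither taken from the notes) of the three bounds against the exact tail probability.

```python
import math

lam, a = 10.0, 20   # Poisson mean and tail threshold (illustrative values)

# exact tail P(X >= a): the series is truncated where terms are negligible
exact = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(a, 120))

markov = lam / a                      # P(X >= a) <= E[X] / a
# one-sided use of Chebyshev: P(X >= a) <= P(|X - lam| >= a - lam) <= Var/(a-lam)^2
chebyshev = lam / (a - lam) ** 2
# optimized exponential (Chernoff) bound for a Poisson tail, valid for a > lam
chernoff = math.exp(-lam) * (math.e * lam / a) ** a
```

The bounds order as expected: Markov is weakest, Chebyshev tighter, and the Chernoff bound captures the exponential decay of the tail.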