Sanov’s theorem asks how *likely* it is that the empirical distribution of a sequence of i.i.d. random variables is *far* from the underlying distribution, and shows that relative entropy determines the likelihood of being far.
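A rough statement of the result being summarized (suppressing regularity conditions; the notation here is an assumption: $\hat P_n$ for the empirical distribution of $n$ i.i.d. samples from $P$, $E$ a set of distributions, $D$ for relative entropy):

```latex
\Pr\!\left(\hat P_n \in E\right) \;\approx\; \exp\!\Bigl(-\,n \inf_{Q \in E} D(Q \,\|\, P)\Bigr)
```

So the least "cost" $D(Q \,\|\, P)$ over distributions $Q$ in $E$ governs the exponential rate of decay.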

# Category: Probability

## Entropy and Boltzmann’s Distribution

Entropy and relative entropy occur sufficiently often in these notes to justify a (somewhat) self-contained section. We cover the discrete case, which is the most intuitive.
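As a minimal sketch of the discrete case, the two quantities can be computed directly from a list of probabilities (the function names here are illustrative, not from the notes):

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution p, given as probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Relative entropy D(p || q) in nats; infinite if q puts zero mass where p does not."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))                   # log 4, the maximum over 4 outcomes
print(relative_entropy(skewed, uniform))  # nonnegative; zero iff p == q
```

The uniform distribution maximizes entropy, and $D(p\,\|\,q) \ge 0$ with equality exactly when the distributions coincide.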

## Itô’s Formula: a heuristic derivation

- A heuristic look at the stochastic integral.
- A heuristic derivation of Itô’s formula.
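A sketch of the heuristic (assuming $X_t$ solves $dX_t = \mu_t\,dt + \sigma_t\,dW_t$ and $f \in C^2$): expand $f$ to second order and use the rule $(dW_t)^2 \approx dt$, discarding terms of order higher than $dt$:

```latex
df(X_t) = f'(X_t)\,dX_t + \tfrac{1}{2} f''(X_t)\,(dX_t)^2
        = \Bigl(\mu_t\, f'(X_t) + \tfrac{1}{2}\sigma_t^2\, f''(X_t)\Bigr)\,dt
          + \sigma_t\, f'(X_t)\,dW_t
```

The extra $\tfrac{1}{2}\sigma_t^2 f''$ drift term, absent from the ordinary chain rule, is what distinguishes the Itô calculus.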

## Basic Probability Bounds

- Markov’s Inequality; Chebyshev’s Inequality; Chernoff’s Bound.
- Bounds for the Poisson Distribution.
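A small numerical sketch comparing the three bounds on a Poisson tail (the specific parameters $\lambda = 4$, $a = 10$ are illustrative; the Chernoff bound used is the standard optimized one, $e^{-\lambda}(e\lambda/a)^a$ for $a > \lambda$):

```python
import math

def poisson_tail(lam, a):
    """Exact P(X >= a) for X ~ Poisson(lam), a a nonnegative integer."""
    return 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(a))

def markov_bound(lam, a):
    # P(X >= a) <= E[X] / a
    return lam / a

def chebyshev_bound(lam, a):
    # P(X >= a) <= Var(X) / (a - E[X])^2 for a > lam; Var(X) = lam for Poisson
    return lam / (a - lam) ** 2

def chernoff_bound(lam, a):
    # Optimized exponential moment bound for a > lam
    return math.exp(-lam) * (math.e * lam / a) ** a

lam, a = 4, 10
print(poisson_tail(lam, a))
print(markov_bound(lam, a), chebyshev_bound(lam, a), chernoff_bound(lam, a))
```

For these values the exact tail is far below all three bounds, and Chernoff is tighter than Chebyshev, which is tighter than Markov, reflecting how much of the distribution each bound uses (mean, variance, full moment generating function).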