Sanov’s theorem asks how likely it is that the empirical distribution of some IID random variables is far from the true distribution, and shows that relative entropy determines the likelihood of being far.
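One heuristic way to state this in the discrete case (a rough form; constants and regularity conditions are suppressed): if $X_1,\dots,X_n$ are IID with common distribution $Q$ on a finite set, $\hat P_n$ is their empirical distribution, and $E$ is a suitable set of distributions, then

$$
P\bigl(\hat P_n \in E\bigr) \approx e^{-n \inf_{P \in E} D(P\|Q)},
\qquad
D(P\|Q) = \sum_x P(x)\,\log\frac{P(x)}{Q(x)},
$$

so the exponential rate of decay is governed by the relative entropy between the closest element of $E$ and $Q$.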
Entropy and relative entropy occur sufficiently often in these notes to justify a (somewhat) self-contained section. We cover the discrete case, which is the most intuitive.
- A heuristic look at the stochastic integral.
- A heuristic derivation of Itô’s formula.
- Markov’s Inequality; Chebyshev’s Inequality; Chernoff’s Bound.
- Bounds for the Poisson Distribution.
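As a small numerical illustration of the last two items, the sketch below compares the standard Chernoff bound for a Poisson upper tail, $P(X \ge a) \le e^{-\lambda}(e\lambda/a)^a$ for $a > \lambda$, against the exact tail probability computed from the pmf. The function names and the particular values of $\lambda$ and $a$ are illustrative choices, not from the notes.

```python
import math

def poisson_chernoff_bound(lam, a):
    # Chernoff bound for the upper tail of Poisson(lam), valid for a > lam:
    #   P(X >= a) <= exp(-lam) * (e * lam / a)**a
    assert a > lam, "the bound is only meaningful for a > lam"
    return math.exp(-lam) * (math.e * lam / a) ** a

def poisson_tail_exact(lam, a):
    # Exact P(X >= a) = 1 - P(X <= a - 1), summing the Poisson pmf directly.
    cdf = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(a))
    return 1.0 - cdf

lam, a = 5.0, 15
bound = poisson_chernoff_bound(lam, a)
exact = poisson_tail_exact(lam, a)
print(f"Chernoff bound: {bound:.6g}, exact tail: {exact:.6g}")
```

The bound is loose in absolute terms but captures the correct exponential decay rate of the tail as $a$ grows, which is the point of the Chernoff method.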