- Laplacian; Adjoints; Harmonic functions; Green’s functions; Forward Equation; Backward Equation.
- Markov chains and martingales; Green’s functions and occupancy; potential functions; time reversal and adjoints.
Category: Probability
Spitzer’s Lyapunov Ergodicity
We show that relative entropy decreases for continuous-time Markov chains.
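This fact can be checked numerically. Below is a minimal sketch with a made-up 3-state generator `Q` (an assumption for illustration, not the section's own example): two distributions evolved by the same chain have nonincreasing relative entropy over time.

```python
import numpy as np

# Hypothetical 3-state generator matrix Q (off-diagonal rates, rows sum to zero).
Q = np.array([[-1.0,  0.7,  0.3],
              [ 0.5, -1.2,  0.7],
              [ 0.4,  0.6, -1.0]])

def transition_matrix(Q, t, n=200):
    """Approximate P(t) = exp(tQ) by uniformization."""
    lam = max(-np.diag(Q))           # uniformization rate
    P = np.eye(len(Q)) + Q / lam     # stochastic matrix
    out = np.zeros_like(P)
    term = np.exp(-lam * t)          # Poisson(lam*t) pmf at k = 0
    Pk = np.eye(len(Q))
    for k in range(n):               # Poisson-weighted sum of powers of P
        out += term * Pk
        Pk = Pk @ P
        term *= lam * t / (k + 1)
    return out

def kl(p, q):
    """Relative entropy KL(p || q) for strictly positive vectors."""
    return float(np.sum(p * np.log(p / q)))

mu = np.array([0.8, 0.1, 0.1])       # two initial distributions
nu = np.array([0.2, 0.3, 0.5])
kls = [kl(mu @ transition_matrix(Q, t), nu @ transition_matrix(Q, t))
       for t in [0.0, 0.5, 1.0, 2.0, 4.0]]
print(kls)  # nonincreasing in t
```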
A Mean Field Limit
We consider a system of interacting objects. As the number of objects increases, we can characterize the limiting behaviour of the system.
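The flavour of the limit can be seen in a toy simulation (the interaction rule `f` below is a made-up example, not the section's model): each object updates according to the current empirical fraction, and as the population grows the fraction tracks a deterministic recursion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interaction rule: each of N objects holds a 0/1 state,
# and at each step every object independently becomes 1 with a
# probability depending only on the current fraction of 1's.
def f(x):
    return 0.2 + 0.6 * x

def simulate(N, steps=30):
    states = np.zeros(N)
    for _ in range(steps):
        x = states.mean()
        states = (rng.random(N) < f(x)).astype(float)
    return states.mean()

# Deterministic mean-field recursion x_{t+1} = f(x_t); fixed point x = 0.5.
x = 0.0
for _ in range(30):
    x = f(x)

frac = simulate(10_000)
print(frac, x)  # close for large N
```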
Cross Entropy Method
In the Cross Entropy Method, we wish to estimate the likelihood

ℓ = P( S(X) ≥ γ ).

Here X is a random variable whose distribution is known and belongs to a parametrized family of densities f(·; v). Further, γ is often a solution to an optimization problem.
Sanov’s Theorem
Sanov’s Theorem asks how likely it is that the empirical distribution of a sequence of i.i.d. random variables is far from their underlying distribution, and shows that the relative entropy determines the likelihood of being far.
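A toy instance (a sketch with assumed numbers, reduced to the binomial case): for a fair coin, the probability that the empirical frequency of heads is at least 0.7 decays exponentially with rate KL(Ber(0.7) ‖ Ber(0.5)).

```python
import math

p, a = 0.5, 0.7   # true coin bias and empirical-frequency threshold

def kl_bern(a, p):
    """Relative entropy KL(Ber(a) || Ber(p))."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def tail(n):
    """Exact P(#heads >= a*n) for n tosses of a fair coin."""
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) for k in range(k0, n + 1)) / 2**n

# Empirical decay rates -log(tail)/n approach the relative entropy.
rates = [-math.log(tail(n)) / n for n in [50, 100, 200, 400]]
print(rates, kl_bern(a, p))   # KL(Ber(0.7) || Ber(0.5)) ≈ 0.0823
```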
Entropy and Boltzmann’s Distribution
Entropy and Relative Entropy occur sufficiently often in these notes to justify a (somewhat) self-contained section. We cover the discrete case, which is the most intuitive.
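A discrete sanity check (a sketch with a made-up 4-state system, not the section's own example): the Boltzmann distribution p_i ∝ exp(−βE_i) maximizes entropy among all distributions with the same mean energy.

```python
import numpy as np

rng = np.random.default_rng(2)

E = np.array([0.0, 1.0, 2.0, 3.0])   # energies of a toy 4-state system
beta = 0.7                            # inverse temperature

def entropy(p):
    return float(-np.sum(p * np.log(p)))

# Boltzmann distribution: p_i proportional to exp(-beta * E_i)
w = np.exp(-beta * E)
p = w / w.sum()
mean_E = p @ E

# Perturb p only in directions orthogonal to both constraints
# (total mass 1 and mean energy mean_E); entropy can only drop.
u1 = np.ones(4) / 2.0                 # unit vector along (1,1,1,1)
u2 = E - (E @ u1) * u1                # component of E orthogonal to u1
u2 /= np.linalg.norm(u2)
max_other = -np.inf
for _ in range(1000):
    d = rng.normal(size=4)
    d -= (d @ u1) * u1
    d -= (d @ u2) * u2
    q = p + 0.01 * d                  # same mass, same mean energy
    if (q > 0).all():
        max_other = max(max_other, entropy(q))

print(entropy(p), max_other)  # the Boltzmann entropy is the larger
```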
Itô’s Formula: a heuristic derivation
- A heuristic look at the stochastic integral.
- A heuristic derivation of Itô’s formula.
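The heuristic these bullets point at is a second-order Taylor expansion combined with the rule that squared Brownian increments behave like time increments; in one line,

```latex
df(B_t) \;=\; f'(B_t)\,dB_t \;+\; \tfrac{1}{2} f''(B_t)\,(dB_t)^2 \;+\; \cdots
        \;\approx\; f'(B_t)\,dB_t \;+\; \tfrac{1}{2} f''(B_t)\,dt ,
```

using $(dB_t)^2 \approx dt$ from the quadratic variation of Brownian motion, with higher-order terms vanishing.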