A discrete-time single-server queue

Many queues can be modeled as discrete-time Markov chains: the state is the length of the queue, and fluctuations in arrivals and service drive the random evolution from one time period to the next. The diagram below shows such a queue.

Figure: arrivals occur from the left and are stored in a buffer. Jobs or customers are represented as rectangles. A server is represented by a circle, which processes the jobs, and they then depart on the right-hand side.
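The dynamics described above can be sketched in a few lines of code. This is an illustrative model, not the one analyzed later in the post: we assume at most one arrival and at most one service completion per time slot, each Bernoulli, so the queue length satisfies Q_{t+1} = max(Q_t + A_t - S_t, 0).

```python
import random

def simulate_queue(p_arrival=0.3, p_service=0.5, steps=10_000, seed=1):
    """Simulate a discrete-time single-server queue (a sketch).

    Assumed model: in each slot at most one arrival (probability
    p_arrival) and at most one service completion (probability
    p_service), so Q_{t+1} = max(Q_t + A_t - S_t, 0).
    """
    rng = random.Random(seed)
    q, history = 0, []
    for _ in range(steps):
        a = 1 if rng.random() < p_arrival else 0  # arrival this slot
        s = 1 if rng.random() < p_service else 0  # potential service
        q = max(q + a - s, 0)                     # queue stays non-negative
        history.append(q)
    return history

lengths = simulate_queue()
print(sum(lengths) / len(lengths))  # long-run average queue length
```

Because arrivals are slower than services here (p_arrival < p_service), the average queue length settles down rather than growing without bound.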

Continue reading “A discrete-time single-server queue”

Markov chains and matrices — A Quick Summary

Probabilities for a Markov chain can be expressed in terms of the initial probability vector \boldsymbol{\lambda} and the transition matrix P.

For example, for x, x_0, \ldots, x_t \in \mathcal{X} and a function r : \mathcal{X} \rightarrow \mathbb{R},

\displaystyle \mathbb{P}(X_0 = x_0, \ldots, X_t = x_t) = \lambda_{x_0} P_{x_0 x_1} \cdots P_{x_{t-1} x_t}

\displaystyle \mathbb{P}(X_t = x) = [\lambda P^t]_x

\displaystyle \mathbb{E}_x[r(X_t)] = (P^t r)_x

So essentially, multiplying P^t on the left (by a distribution) gives probabilities, and multiplying it on the right (by a function) gives expectations.

Notation explanation. If the above is unclear, some notation may need explaining. Here P^t = P \times P \times \cdots \times P denotes the product of t copies of the matrix P.[1] Interpreting \boldsymbol{\lambda} as a row vector, \boldsymbol{\lambda} P^t is a row vector, and [\boldsymbol{\lambda} P^t]_x is the x-th component of that row vector.

Similarly, we can think of the function r : \mathcal{X} \rightarrow \mathbb{R} as a column vector \boldsymbol{r} = (r(x) : x \in \mathcal{X}). Then P^t \boldsymbol{r} is a column vector under matrix multiplication, and (P^t \boldsymbol{r})_x denotes its x-th component.

Throughout these notes, \mathbb{P}_x(\cdot) = \mathbb{P}(\cdot \mid X_0 = x) and \mathbb{E}_x[\cdot] = \mathbb{E}[\cdot \mid X_0 = x].
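These identities are easy to check numerically. Below is a small two-state example (the chain and its numbers are purely illustrative): \boldsymbol{\lambda} P^t gives the distribution of X_t as a row vector, and P^t \boldsymbol{r} gives the expectations \mathbb{E}_x[r(X_t)] as a column vector.

```python
import numpy as np

# An illustrative two-state chain on states {0, 1}.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
lam = np.array([1.0, 0.0])   # start in state 0 with probability 1
r = np.array([0.0, 1.0])     # r(x) = 1 if x = 1, else 0

t = 3
Pt = np.linalg.matrix_power(P, t)   # t copies of P multiplied together

print(lam @ Pt)   # row vector: [P(X_t = 0), P(X_t = 1)]
print(Pt @ r)     # column vector: (E_0[r(X_t)], E_1[r(X_t)])
```

Note the two answers agree where they should: [\boldsymbol{\lambda} P^t]_1 and (P^t \boldsymbol{r})_0 are both \mathbb{P}_0(X_3 = 1), since the chain starts in state 0 and r indicates state 1.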

[1] We use \top for transpose, not t.

Discrete-time Markov chains — A Quick Look

Markov chains are the primary stochastic process used in queueing theory. The main elements of a Markov chain are its state and the independent randomness used to generate future states. This creates a very clean and natural form of dependence for which many properties are known. We will cover a few important properties of Markov chains, including recurrence, transience, and the existence of stationary distributions.

Continue reading “Discrete-time Markov chains — A Quick Look”

Zero-Order Stochastic Optimization: Kiefer-Wolfowitz

We want to optimize the expected value of some random function. This is the problem we solved with Stochastic Gradient Descent. However, we now assume that we no longer have access to an unbiased estimate of the gradient; we can only obtain noisy estimates of the function itself. In this case we can apply the Kiefer-Wolfowitz procedure.

The idea here is to replace the random gradient estimate used in stochastic gradient descent with a finite difference of noisy function evaluations. If the increments used for these finite differences shrink sufficiently slowly relative to the step sizes, the iterates converge over time; the approximation error of the finite difference is what slows the rate of convergence relative to stochastic gradient descent.
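A one-dimensional sketch of the procedure is below. The step sizes a_n = a/n and difference widths c_n = c/n^{1/3} are one standard choice satisfying the usual conditions; the toy objective and all parameter values are illustrative, not taken from the post.

```python
import random

def kiefer_wolfowitz(noisy_f, x0, steps=5000, a=1.0, c=1.0, seed=0):
    """Kiefer-Wolfowitz stochastic approximation (1-d sketch).

    Replaces the gradient used in SGD with a central finite
    difference of noisy function evaluations, with step sizes
    a_n = a/n and difference widths c_n = c/n**(1/3).
    """
    rng = random.Random(seed)
    x = x0
    for n in range(1, steps + 1):
        a_n = a / n
        c_n = c / n ** (1 / 3)
        # Finite-difference estimate of the gradient at x.
        grad_est = (noisy_f(x + c_n, rng) - noisy_f(x - c_n, rng)) / (2 * c_n)
        x -= a_n * grad_est
    return x

# Toy objective: F(x) = E[(x - 3)^2 + noise], minimized at x = 3.
def noisy_f(x, rng):
    return (x - 3.0) ** 2 + rng.gauss(0.0, 0.1)

print(kiefer_wolfowitz(noisy_f, x0=0.0))  # should land near 3
```

Only function values are queried here, never gradients, which is what makes the method "zero-order".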

Continue reading “Zero-Order Stochastic Optimization: Kiefer-Wolfowitz”

Continuous Probability Distributions

We consider distributions that take a continuous range of values. Discrete probability distributions were defined by a probability mass function; analogously, continuous probability distributions are defined by a probability density function.
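The defining property is that probabilities are integrals of the density. As a quick numerical illustration (using the exponential distribution as an example, chosen here for convenience), a Riemann sum of the density recovers the closed-form value of P(X \le x):

```python
import math

# Density of an Exponential(rate) distribution, as an illustration.
def pdf(x, rate=2.0):
    return rate * math.exp(-rate * x) if x >= 0 else 0.0

# P(X <= x) is the integral of the density from 0 to x: approximate
# it with a midpoint Riemann sum and compare with 1 - exp(-rate*x).
def cdf_numeric(x, rate=2.0, n=100_000):
    dx = x / n
    return sum(pdf((i + 0.5) * dx, rate) * dx for i in range(n))

print(cdf_numeric(1.0))            # numerical integral of the pdf
print(1 - math.exp(-2.0 * 1.0))    # closed-form P(X <= 1)
```

The two printed values agree to several decimal places, which is exactly the pmf-to-pdf analogy: sums of point masses become integrals of a density.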

(This is a section in the notes here.)

Continue reading “Continuous Probability Distributions”

Discrete Probability Distributions

Some probability distributions occur frequently, either because they have a particularly natural or simple construction, or because they arise as the limit of a simpler distribution. Here we cover:

  • Bernoulli random variables
  • Binomial distribution
  • Geometric distribution
  • Poisson distribution
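One instance of a distribution arising as a limit: the Poisson(\lambda) distribution is the limit of Binomial(n, \lambda/n) as n grows. A quick numerical comparison of the two pmfs (the values of \lambda and n below are illustrative):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

# Binomial(n, lam/n) pmf approaches the Poisson(lam) pmf for large n.
lam, n = 2.0, 10_000
for k in range(5):
    print(k, binom_pmf(k, n, lam / n), poisson_pmf(k, lam))
```

Already at n = 10,000 the two columns agree to about three decimal places, which is why the Poisson distribution models counts of many rare, independent events.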

(This is a section in the notes here.)

Continue reading “Discrete Probability Distributions”