Poisson Arrivals See Time Averages (PASTA)

We are often interested in the state an arriving customer observes in a queue. If arrivals are Poisson, then the average arrival sees the system in its equilibrium state.

This may seem counterintuitive. When a customer arrives, we might expect the system to be slightly emptier in anticipation of the arrival. However, this is not the case for a Poisson process: its arrivals are spread uniformly over time, and the arrivals after time t are independent of everything that happened before t.
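The property is easy to see numerically. Below is a minimal sketch (all rates and the M/M/1 model are illustrative choices, not taken from the post): we simulate a single-server queue with Poisson arrivals and compare the long-run fraction of time the system is busy with the fraction of arrivals that find it busy. PASTA says the two should agree.

```python
import random

random.seed(1)
lam, mu = 0.5, 1.0            # illustrative arrival and service rates
T = 200_000.0                 # simulation horizon

t, n = 0.0, 0                 # current time, number in system
busy_time = 0.0               # total time the system is non-empty
arrivals = arrivals_find_busy = 0
next_arrival = random.expovariate(lam)
next_departure = float("inf")

while True:
    t_next = min(next_arrival, next_departure)
    if t_next >= T:
        if n > 0:
            busy_time += T - t
        break
    if n > 0:
        busy_time += t_next - t
    t = t_next
    if next_arrival <= next_departure:    # an arrival occurs
        arrivals += 1
        if n > 0:
            arrivals_find_busy += 1       # this arrival sees a busy system
        n += 1
        if n == 1:                        # server was idle; start a service
            next_departure = t + random.expovariate(mu)
        next_arrival = t + random.expovariate(lam)
    else:                                 # a departure occurs
        n -= 1
        next_departure = t + random.expovariate(mu) if n > 0 else float("inf")

time_avg = busy_time / T
arrival_avg = arrivals_find_busy / arrivals
print(f"fraction of time busy: {time_avg:.3f}, "
      f"fraction of arrivals finding it busy: {arrival_avg:.3f}")
```

Both estimates should be close to the utilisation lam/mu = 0.5, matching the time average with the average seen by arrivals.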

Continue reading “Poisson Arrivals See Time Averages (PASTA)”

Kelly’s Lemma

Kelly’s Lemma provides a way of using the time reversal of a Markov chain to calculate its stationary distribution. In principle, it is just a restatement of the stationary-distribution and time-reversal results for continuous-time chains. In application, however, it is different: you can guess the transition rates Q’ of the time-reversed chain, and if your guess is right, i.e., if it checks out in Kelly’s Lemma, you get the stationary distribution as a consequence.
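The guess-and-verify pattern can be sketched numerically. Here is a hypothetical illustration on a truncated M/M/1 queue (a birth–death chain on {0, …, N}; the rates, the truncation, and the geometric guess are all illustrative assumptions): we guess that the chain is reversible, so the reversed rates q' equal q, check the two conditions of Kelly's Lemma, and confirm that the guessed distribution indeed satisfies the balance equations.

```python
lam, mu, N = 1.0, 2.0, 10     # illustrative rates and truncation level

# Transition rates q[j][k] of a truncated M/M/1 birth-death chain.
q = [[0.0] * (N + 1) for _ in range(N + 1)]
for j in range(N):
    q[j][j + 1] = lam          # arrival
    q[j + 1][j] = mu           # service completion

# Guess: the chain is reversible, so the reversed rates q' equal q,
# and the candidate stationary distribution is geometric in rho.
rho = lam / mu
pi = [rho ** j for j in range(N + 1)]
Z = sum(pi)
pi = [p / Z for p in pi]
qp = q                         # guessed time reversal

# Kelly's Lemma conditions:
# (1) pi_j q(j,k) = pi_k q'(k,j) for all j != k;
# (2) the total outflow rates agree: sum_k q(j,k) = sum_k q'(j,k).
ok1 = all(abs(pi[j] * q[j][k] - pi[k] * qp[k][j]) < 1e-12
          for j in range(N + 1) for k in range(N + 1) if j != k)
ok2 = all(abs(sum(q[j]) - sum(qp[j])) < 1e-12 for j in range(N + 1))

# Consequence: pi solves the balance equations
# sum_{j != k} pi_j q(j,k) = pi_k sum_l q(k,l) for every state k.
balance_gap = max(abs(sum(pi[j] * q[j][k] for j in range(N + 1) if j != k)
                      - pi[k] * sum(q[k])) for k in range(N + 1))
print(ok1, ok2, balance_gap < 1e-12)
```

If the guessed reversal had been wrong, one of the two checks would fail, and the candidate pi would carry no guarantee.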

Continue reading “Kelly’s Lemma”

Stationary Distributions (Continuous Time)

We quickly recap the results known for discrete time that also hold in continuous time. (A slight advantage from a modeling perspective is that in continuous time, transitions occur one at a time rather than simultaneously, which gives more chances for reversibility to hold.)

Continue reading “Stationary Distributions (Continuous Time)”

Continuous-time Markov chains

We want to model queueing systems in continuous time. Continuous-time Markov chains are a natural generalization of discrete-time Markov chains. As in our discussion of Poisson processes, we focus on developing intuition for them as the continuous-time extension of discrete chains.

Continue reading “Continuous-time Markov chains”

A discrete-time single-server queue

Many queues can be modeled as discrete-time Markov chains: the state is the length of the queue, and fluctuations in arrivals and service drive the random evolution from one time period to the next. Below is a diagram of a queue.

Figure: arrivals occur from the left and are stored in a buffer. Jobs or customers are represented as rectangles. A server is represented by a circle, which processes the jobs, and they then depart on the right-hand side.
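A minimal sketch of such a chain (the Bernoulli arrival and service probabilities, and the exact order of events within a slot, are illustrative assumptions; the post's own model may differ): in each time slot at most one job arrives and at most one service completes, and the queue length evolves by the recursion X' = max(X + A - S, 0).

```python
import random

random.seed(0)
p, q = 0.3, 0.5               # arrival and service-completion probabilities per slot
X, T = 0, 100_000             # queue length, number of slots
counts = {}                   # empirical distribution of the queue length
for _ in range(T):
    counts[X] = counts.get(X, 0) + 1
    A = 1 if random.random() < p else 0   # Bernoulli arrival this slot
    S = 1 if random.random() < q else 0   # potential service completion
    X = max(X + A - S, 0)                 # queue-length recursion
frac_empty = counts.get(0, 0) / T
print(f"fraction of slots with an empty queue: {frac_empty:.3f}")
```

Since p < q here, the chain is stable, and the empirical distribution of X settles down to a stationary distribution; the printed fraction approximates its mass at zero.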

Continue reading “A discrete-time single-server queue”

Markov chains and matrices — A Quick Summary

Probabilities for a Markov chain can be expressed in terms of the probability vector \boldsymbol{\lambda} and the transition matrix P.

For example, for x, x_0, \ldots, x_t \in \mathcal{X} and a function r : \mathcal{X} \rightarrow \mathbb{R},

\displaystyle \mathbb{P}(X_0 = x_0, \ldots, X_t = x_t) = \lambda_{x_0} P_{x_0 x_1} \cdots P_{x_{t-1} x_t}

\displaystyle \mathbb{P}(X_t = x) = [\boldsymbol{\lambda} P^t]_x

\displaystyle \mathbb{E}_x[r(X_t)] = (P^t r)_x

So essentially, multiplication on the left gives probabilities and multiplication on the right gives expectations.

Notation explanation. In case the above is unclear: for P^t = P \times P \times \cdots \times P, we multiply t copies of the matrix P together.1 Interpreting \boldsymbol{\lambda} as a row vector, \boldsymbol{\lambda} P^t is a row vector, and [\boldsymbol{\lambda} P^t]_x is the x-th component of that row vector.

Similarly, we can think of the function r : \mathcal{X} \rightarrow \mathbb{R} as a column vector \boldsymbol{r} = (r(x) : x \in \mathcal{X}). Then P^t \boldsymbol{r} is a column vector under matrix multiplication, and (P^t \boldsymbol{r})_x denotes its x-th component.

Throughout these notes, \mathbb{P}_x(\cdot) = \mathbb{P}(\cdot \mid X_0 = x) and \mathbb{E}_x[\cdot] = \mathbb{E}[\cdot \mid X_0 = x].
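As a quick numerical check of the identities above, here is a toy two-state chain (the transition matrix, initial distribution, and reward function are all made-up illustrative numbers): multiplying \boldsymbol{\lambda} on the left of P^t gives the distribution of X_t, and multiplying the column vector \boldsymbol{r} on the right gives the expectations \mathbb{E}_x[r(X_t)].

```python
# Toy two-state chain on states {0, 1}; all numbers are illustrative.
P = [[0.9, 0.1],
     [0.4, 0.6]]              # transition matrix
lam = [0.5, 0.5]              # initial probability vector (row vector)
r = [1.0, 3.0]                # reward function r(x), viewed as a column vector

def mat_mul(A, B):            # matrix * matrix
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def row_times_mat(v, A):      # left multiplication: distributions
    return [sum(v[i] * A[i][j] for i in range(len(v))) for j in range(len(A[0]))]

def mat_times_col(A, v):      # right multiplication: expectations
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

t = 3
Pt = P
for _ in range(t - 1):        # P^t by repeated multiplication
    Pt = mat_mul(Pt, P)

dist_t = row_times_mat(lam, Pt)   # P(X_t = x) = [lam P^t]_x
exp_t = mat_times_col(Pt, r)      # E_x[r(X_t)] = (P^t r)_x
print(dist_t, exp_t)
```

As a consistency check, combining both sides gives the same number: summing lam_x (P^t r)_x over x equals summing [lam P^t]_x r(x), since both compute \boldsymbol{\lambda} P^t \boldsymbol{r}.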

1 We use \top for transpose, not t.