# Continuous Probability Distributions

We consider distributions that take a continuous range of values. Discrete probability distributions were defined by a probability mass function. Analogously, continuous probability distributions are defined by a probability density function.


Definition [Probability Density Function] A probability density function (pdf) is a function $f: \mathbb R \rightarrow \mathbb R_+$ that has two properties:

• (Positive) For $x \in \mathbb R$, $f(x) \geq 0$.

• (Integrates to one) $\int_{-\infty}^{\infty} f(x)\, dx = 1$.

From this we can define the following.

Definition [Continuous Probability Distribution] A random variable $X$ with values in $\mathbb R$ has a continuous probability distribution with pdf $f(x)$ if

$$F(x) := \mathbb P(X \leq x) = \int_{-\infty}^{x} f(u)\, du .$$

As before, $F(x)$ is called the cumulative distribution function (CDF), and as before it satisfies

• $0 \leq F(x) \leq 1$
• $F(x)$ is non-decreasing

and it also satisfies $F'(x) = f(x)$.

A key observation is that when making the conceptual switch from (discrete) probability mass functions to (continuous) probability density functions, we have replaced summation with integration. This is the main difference, and since most properties of sums also hold for integrals[^1], many properties carry over to continuous random variables.

A few observations. Notice that the equation $F'(x) = f(x)$ is a consequence of the Fundamental Theorem of Calculus. Note that while a pmf must be bounded above by $1$, in principle a pdf can be unbounded.[^2] We may want to restrict a continuous random variable to a range of values; for example, if we assume our random variable is positive, then its pdf satisfies $f(x) = 0$ for $x < 0$. Finally, it does not make sense to think of a continuous random variable as taking any specific value, since the integral over a single point is zero. Instead we think of the random variable as belonging to some range of values. For instance, for $a < b$ we have

$$\mathbb P(a \leq X \leq b) = \int_a^b f(x)\, dx = F(b) - F(a).$$
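To make the interval probability $\mathbb P(a \leq X \leq b) = \int_a^b f(x)\,dx = F(b) - F(a)$ concrete, here is a short numerical sketch in Python. The choice of an Exponential(2) distribution is purely illustrative (it is not from the text); the midpoint-rule integral of the pdf over $[a,b]$ is compared against the closed-form CDF.

```python
import math

def f(x, lam=2.0):
    """pdf of an Exponential(lam) random variable; zero for x < 0."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def prob_interval(a, b, n=100_000):
    """Approximate P(a <= X <= b) = integral of f over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Compare with F(b) - F(a), where F(x) = 1 - exp(-lam * x) is the closed-form CDF.
approx = prob_interval(0.5, 1.5)
exact = (1 - math.exp(-2 * 1.5)) - (1 - math.exp(-2 * 0.5))
```

The two numbers agree to many decimal places, illustrating that "probability of a range" is just "area under the pdf over that range".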

Joint distributions. We can consider the pdf of two random variables (or more). If $X$, $Y$ are continuous random variables (defined on the same probability space) then their joint pdf is a function $f(x,y)$ such that

• (Positive) For $x, y \in \mathbb R$, $f(x,y) \geq 0$.

• (Integrates to one) $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x,y)\, dx\, dy = 1$.

• $\mathbb P(X \leq x,\, Y \leq y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(u,v)\, du\, dv$.

If $X$ and $Y$ are independent then the joint pdf is the product of the marginal pdfs:

$f(x,y) = f_X(x) f_Y(y).$

All of the above extends to more than two random variables $X_1,\dots,X_n$ in the way you might naturally expect, e.g. the joint pdf is a function of the form $f(x_1,\dots,x_n)$.
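As a sanity check on the product rule for independent random variables, the following Python sketch (with two exponential distributions as an assumed, illustrative example) verifies numerically that $\mathbb P(X \leq 1, Y \leq 1)$, obtained by integrating the product pdf over a square, equals $\mathbb P(X \leq 1)\,\mathbb P(Y \leq 1)$.

```python
import math

def f_X(x):
    return math.exp(-x) if x >= 0 else 0.0       # Exponential(1) pdf

def f_Y(y):
    return 2 * math.exp(-2 * y) if y >= 0 else 0.0  # Exponential(2) pdf

def f_joint(x, y):
    # Independence: the joint pdf factorises as the product of the marginals.
    return f_X(x) * f_Y(y)

def double_integral(a, b, c, d, n=400):
    """Midpoint-rule approximation of the integral of f_joint over [a,b] x [c,d]."""
    hx, hy = (b - a) / n, (d - c) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f_joint(a + (i + 0.5) * hx, c + (j + 0.5) * hy)
    return total * hx * hy

# P(X <= 1, Y <= 1) should equal P(X <= 1) * P(Y <= 1).
joint = double_integral(0, 1, 0, 1)
product = (1 - math.exp(-1)) * (1 - math.exp(-2))
```

The double integral over the square factorises into a product of two one-dimensional integrals precisely because the integrand factorises.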

## Expectations

Analogous to the expectation of discrete random variables, we have the following definition.

Definition [Expectation, continuous case] The expectation of a continuous random variable $X$ with pdf $f(x)$ is given by

$$\mathbb E[X] = \int_{-\infty}^{\infty} x f(x)\, dx .$$

Similarly, the variance is defined much as before:

$$\mathrm{Var}(X) = \mathbb E\big[(X - \mathbb E[X])^2\big] = \int_{-\infty}^{\infty} (x - \mathbb E[X])^2 f(x)\, dx .$$
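These integrals are easy to check numerically. A minimal Python sketch, assuming an Exponential($\lambda$) random variable (an illustrative choice, for which $\mathbb E[X] = 1/\lambda$ and $\mathrm{Var}(X) = 1/\lambda^2$ are known in closed form):

```python
import math

def f(x, lam=3.0):
    """pdf of an Exponential(lam) random variable (an illustrative choice)."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def integrate(g, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Truncate the infinite range at 10, where the exponential tail is negligible.
mean = integrate(lambda x: x * f(x), 0, 10)
var = integrate(lambda x: (x - mean) ** 2 * f(x), 0, 10)
# Theory: mean = 1/lam = 1/3 and var = 1/lam**2 = 1/9.
```

Note that the expectation is computed first and then reused inside the variance integrand, mirroring the definition above.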

The following proposition is an amalgamation of the lemmas that we had for discrete random variables.

The proof of the above result follows by an almost identical argument to the earlier discrete results: just replace the summations with integrals. For that reason we omit the proof of this proposition.

## The Normal Distribution

The normal distribution arises in many situations involving measurement, e.g. the distribution of heights, the relative change in a stock index, the measurement of physical phenomena (e.g. a comet passing the sun), the result of an election poll, the distribution of heat.

The normal distribution is, perhaps, the most important probability distribution. Why is this? Roughly because it is the distribution that arises when you add up lots of small independent errors. This is stated more formally as a result called the central limit theorem, which we will discuss shortly.

Definition [Standard Normal Distribution] The standard normal distribution has probability density function

$$\phi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$

for $-\infty < x < \infty$. If a random variable $Z$ is a standard normal random variable we write $Z\sim \mathcal N(0,1)$. The cumulative distribution function is

$$\Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-u^2/2}\, du .$$

It can be shown that a standard normal random variable has mean $0$ and variance $1$. By shifting and scaling we can achieve other values for the mean and variance.
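The claim that the standard normal has mean $0$ and variance $1$ (and that its pdf integrates to one) can be checked numerically. A small Python sketch using the midpoint rule over $[-10, 10]$, outside of which the remaining mass is negligible:

```python
import math

def phi(x):
    """Standard normal pdf: exp(-x^2 / 2) / sqrt(2 * pi)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def integrate(g, a=-10.0, b=10.0, n=100_000):
    """Midpoint-rule integral of g over [a, b]; [-10, 10] captures essentially all the mass."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(phi)                           # integrates to one
mean = integrate(lambda x: x * phi(x))           # mean zero (odd integrand)
variance = integrate(lambda x: x * x * phi(x))   # variance one (since mean is zero)
```

The mean integral vanishes by symmetry, and since the mean is $0$ the variance reduces to $\int x^2 \phi(x)\, dx$.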

Definition [Normal Distribution] The normal distribution with mean $\mu$ and variance $\sigma^2$ has probability density function

$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$

for $-\infty < x <\infty$. If $X$ is a normally distributed random variable with mean $\mu$ and variance $\sigma^2$ then we write $X \sim \mathcal N( \mu , \sigma^2)$.

A useful point is that if $Z \sim \mathcal N(0,1)$ then $\mu + \sigma Z \sim \mathcal N(\mu, \sigma^2)$. Thus we see that a normal random variable is simply a standard normal random variable that has been rescaled (by $\sigma$) and shifted (by $\mu$).
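The rescale-and-shift relationship can be seen directly at the level of pdfs: by a change of variables, $X = \mu + \sigma Z$ has pdf $\frac{1}{\sigma}\phi\big(\frac{x-\mu}{\sigma}\big)$, which is exactly the $\mathcal N(\mu, \sigma^2)$ density. A short Python sketch checks this pointwise (the values $\mu = 1.5$, $\sigma = 2$ are an arbitrary illustrative choice):

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def normal_pdf(x, mu, sigma):
    """N(mu, sigma^2) pdf, written directly from the definition."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Change of variables: if X = mu + sigma * Z, the pdf of X is
# (1 / sigma) * phi((x - mu) / sigma), which should match normal_pdf pointwise.
mu, sigma = 1.5, 2.0
checks = [normal_pdf(x, mu, sigma) - phi((x - mu) / sigma) / sigma for x in (-3, 0, 1.5, 4)]
```

All differences are zero up to floating-point rounding, confirming the two expressions for the density agree.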

[^1]: After all, integrals are just fancy sums.
[^2]: It is possible to let $f(x) = \infty$ for some values, but we ignore such cases for now.