This post is a follow-up to the last post on
moment generating functions (mgfs), where we showed that if the mgf of
a probability measure $\mu$ on $\mathbb{R}$ (also known as a
distribution) exists on some neighborhood of $0$, then the
moments of $\mu$ uniquely determine $\mu$. Let's quickly review the
relevant definitions and theorems from the last post.
Definition.
For a probability measure $\mu$ on $\mathbb{R}$ the $n$th
moment is defined as
$\int_{-\infty}^{\infty}x^n\mathrm{d}\mu(x)$ with the $n$th
absolute moment of $\mu$ being
$\int_{-\infty}^{\infty}|x|^n\mathrm{d}\mu(x)$. Here $n$ is any
non-negative integer (the $0$th moment is just $1$ for every $\mu$).
Definition.
The moment generating function or mgf of a
probability measure $\mu$ on $\mathbb{R}$ is the function
$$
M_{\mu}(t) = \int_{-\infty}^{\infty}e^{tx}\mathrm{d}\mu(x),
$$
provided the integral is well defined for all $t$ in an open interval
$(-R, R)$ for some $R > 0$. If not, then we say that the mgf of $\mu$
doesn't exist.
And we had the following proposition that relates mgfs back to
moments:
Proposition.
Suppose the moment generating function exists for the probability
measure $\mu$. Then for all integers $n > 0$, the $n$th
moment of $\mu$ is finite and is given by
$$
\left.\frac{\mathrm{d}^n}{\mathrm{d}t^n}M_{\mu}(t)\right\rvert_{t=0}.
$$
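To make the proposition concrete, here is a small numerical sketch (Python with numpy/scipy, not part of the original post; the standard exponential distribution, whose mgf is $1/(1-t)$ for $t < 1$ and whose $n$th moment is $n!$, is just an assumed example). The first two derivatives of the mgf at $0$, estimated by finite differences, should match the first two moments.

```python
import numpy as np
from math import factorial
from scipy.integrate import quad

def mgf(t):
    # M(t) = E[e^{tX}] for X ~ Exp(1), computed by numerical integration.
    return quad(lambda x: np.exp(t * x) * np.exp(-x), 0, np.inf)[0]

def moment(n):
    # n-th moment of Exp(1), computed directly from the definition.
    return quad(lambda x: x**n * np.exp(-x), 0, np.inf)[0]

# Central finite differences approximate the first two derivatives of M at 0.
h = 1e-2
d1 = (mgf(h) - mgf(-h)) / (2 * h)
d2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2

print(d1, moment(1), factorial(1))  # all approximately 1
print(d2, moment(2), factorial(2))  # all approximately 2
```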
And then we proved the following theorem:
Theorem.
Let $\mu$ be a probability measure on $\mathbb{R}$, and suppose the
mgf $M_{\mu}(t)$ exists in some interval $(-R, R)$. Then the
moments $a_n \defeq \int_{-\infty}^{\infty}x^n\mathrm{d}\mu(x)$ are
finite and uniquely determine $\mu$.
Example Moment Generating Functions
In this section we will compute the moment generating functions of
common continuous distributions. We will describe our distributions
(i.e. probability measures) with probability density functions,
or pdfs. Recall that a function $f \colon \mathbb{R} \to
\mathbb{R}$ is a pdf if $f(x) \ge 0$ for all $x \in \mathbb{R}$ and
$\int_{-\infty}^{\infty}f(x)\mathrm{d}x = 1$. The Radon-Nikodym
theorem stated below gives conditions under which a probability
measure $\mu$ has a pdf.
Theorem (Radon-Nikodym).
Suppose that $\mu$ is a probability measure on $\mathbb{R}$ that is
absolutely continuous with respect to the Lebesgue measure. Then there
exists a function $f \colon \mathbb{R} \to [0, \infty)$ such that for
any $A \in \mathcal{B}$, we have
$$
\mu(A) = \int_{A}f(x)\mathrm{d}x.
$$
In other words, such a $\mu$ has a pdf, namely $f$.
For example, the function
$$
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(x - \mu)^2}{2\sigma^2}}
$$
is a pdf with parameters $-\infty < \mu < \infty$ and $0 <
\sigma < \infty$. The associated distribution is called the
\emph{Gaussian (normal) distribution} and is denoted as
$\mathcal{N}(\mu, \sigma^2)$. For the Gaussian distribution
$\mathcal{N}(\mu, \sigma^2)$ we see (by completing the square in the
exponent) that the mgf is
$$
M(t) = \int_{-\infty}^{\infty}e^{tx}\frac{1}{\sqrt{2\pi}\,\sigma}
       e^{-\frac{(x - \mu)^2}{2\sigma^2}}\mathrm{d}x
     = e^{\mu t + \frac{\sigma^2 t^2}{2}}.
\label{eq:norm}
$$
A similar computation shows that the gamma distribution
$\gammad(\alpha, \beta)$ with $\alpha, \beta > 0$, whose pdf is
$$
f(x) = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}x^{\alpha - 1}e^{-x/\beta}
$$
for $x > 0$ (and $0$ for $x \le 0$), has mgf
$$
M(t) = (1 - \beta t)^{-\alpha}, \qquad t < 1/\beta.
\label{eq:gamma}
$$
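As a quick sanity check on \eqref{eq:norm} and \eqref{eq:gamma}, here is a small numerical sketch (Python with scipy, not from the original post; the particular parameter values and values of $t$ are arbitrary assumptions) comparing $\int e^{tx}f(x)\mathrm{d}x$ against the closed forms.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, gamma

# Gaussian mgf: E[e^{tX}] for X ~ N(mu, sigma^2) vs exp(mu*t + sigma^2*t^2/2).
mu, sigma = 1.5, 2.0
for t in (-0.5, 0.25, 1.0):
    numeric = quad(lambda x: np.exp(t * x) * norm.pdf(x, loc=mu, scale=sigma),
                   -np.inf, np.inf)[0]
    closed_form = np.exp(mu * t + sigma**2 * t**2 / 2)
    print(t, numeric, closed_form)  # the last two columns agree

# Gamma mgf: scipy's gamma uses a shape parameter `a` and a `scale` parameter,
# which correspond to alpha and beta in the parameterization used here.
alpha, beta, t = 3.0, 0.5, 0.8  # note t < 1/beta
numeric = quad(lambda x: np.exp(t * x) * gamma.pdf(x, a=alpha, scale=beta),
               0, np.inf)[0]
print(numeric, (1 - beta * t) ** (-alpha))  # the two values agree
```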
Now that we know some example moment generating functions,
let's show their true power. First, let's recall some background on
random variables.
Suppose $(S, \mathcal{A},\mu)$ is a probability space. We call any
measurable function $X\colon S \to \mathbb{R}$ a random
variable, where $\mathbb{R}$ is equipped with the Borel sets
$\mathcal{B}$. The random variable $X$ induces a probability measure
$\mu_X$ on $\mathbb{R}$ of the form
$$
\mu_X(A) = \mu(X^{-1}(A))
$$
where $A \in \mathcal{B}$. The measure $\mu_X$ is called the
pushforward measure. Suppose $\nu$ is a probability measure
on $\mathbb{R}$. We say $X$ has distribution $\nu$, or $X \sim \nu$, if
$\mu_X = \nu$. We call a set $\{X_i\}_{i\in I}$ of random variables
identically distributed if they all have the same distribution,
that is, if $\mu_{X_i} = \mu_{X_j}$ for all $i, j \in I$. Recall that a collection of
measurable sets $\{A_{i}\}_{i \in I}$ of $S$ is called
independent if for any finite subcollection $\{A_{i_1}, \dots,
A_{i_n}\}$ we have
$$
\mu(A_{i_1} \cap \cdots \cap A_{i_n}) = \mu(A_{i_1})\cdots\mu(A_{i_n}).
$$
Furthermore, we say a collection of random variables $\{X_i\}_{i \in
I}$ over $S$ is independent if for any choice of Borel
measurable sets $B_i \subseteq \mathbb{R}$, $i \in I$, the collection
$\{X_{i}^{-1}(B_i)\}_{i \in I}$ is independent. Lastly, we say a
collection $\{X_i\}_{i \in I}$ of random variables over $S$ is iid
if it is both independent and identically distributed.
Proposition.
Let $X_1, \dots, X_n$ be a set of independent random variables over
some probability space $(S, \mathcal{A}, \mu)$, with mgfs
$M_{X_1}(t), \dots, M_{X_n}(t)$. Let $Y = X_1 + \cdots + X_n$. Then
the mgf of $Y$ exists and equals
$$
M_{Y}(t) = M_{X_1}(t) \cdots M_{X_n}(t).
$$
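Here is a minimal Monte Carlo sketch of this product rule (not part of the original post; the choice of $X_1 \sim \mathrm{Exp}(1)$ and $X_2 \sim \mathcal{N}(0,1)$, the sample size, and the seed are assumptions made just for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x1 = rng.exponential(scale=1.0, size=n)      # X1 ~ Exp(1), mgf 1/(1 - t)
x2 = rng.normal(loc=0.0, scale=1.0, size=n)  # X2 ~ N(0, 1), mgf e^{t^2/2}

t = 0.3
lhs = np.mean(np.exp(t * (x1 + x2)))                     # estimate of M_Y(t)
rhs = np.mean(np.exp(t * x1)) * np.mean(np.exp(t * x2))  # product of mgf estimates
exact = (1 / (1 - t)) * np.exp(t**2 / 2)                 # closed-form product
print(lhs, rhs, exact)  # all approximately equal (about 1.49)
```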
Proposition.
Let $X_1, \dots, X_n$ be independent random variables over the
probability space $(S, \mathcal{A}, \mu)$ such that $X_i \sim
\mathcal{N}(\mu_i, \sigma_i^2)$. Let $Y = X_1 + \cdots + X_n$. Then
$Y \sim \mathcal{N}(\mu_1 + \cdots + \mu_n, \sigma_1^2 + \cdots +
\sigma_n^2)$.
Proof.
We compute the mgf of $Y$ using \eqref{eq:norm} to see
$$
M_{Y}(t) = \prod_{i=1}^{n}M_{X_i}(t)
         = \prod_{i=1}^{n}e^{\mu_i t + \frac{\sigma_i^2 t^2}{2}}
         = e^{(\mu_1 + \cdots + \mu_n)t + \frac{(\sigma_1^2 + \cdots + \sigma_n^2)t^2}{2}},
$$
which is the mgf of $\mathcal{N}(\mu_1 + \cdots + \mu_n, \sigma_1^2 +
\cdots + \sigma_n^2)$. Uniqueness of mgfs gives us $Y \sim
\mathcal{N}(\mu_1 + \cdots + \mu_n, \sigma_1^2 + \cdots + \sigma_n^2)$
and the proof is complete.
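A simulation sketch of the proposition (not from the post; the parameter values, sample size, and seed are arbitrary assumptions, and scipy's Kolmogorov-Smirnov test stands in for a formal check of the distribution of the sum).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mus = [0.0, 2.0, -1.0]
sigmas = [1.0, 0.5, 2.0]
samples = sum(rng.normal(m, s, size=10**5) for m, s in zip(mus, sigmas))

# The proposition predicts this particular normal distribution for the sum.
claimed = stats.norm(loc=sum(mus), scale=np.sqrt(sum(s**2 for s in sigmas)))
print(stats.kstest(samples, claimed.cdf))  # p-value is typically large: consistent
```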
Proposition.
Let $X_1, \dots, X_n$ be independent random variables over the
probability space $(S, \mathcal{A}, \mu)$ such that $X_i \sim
\gammad(\alpha_i, \beta)$ (note that $\beta$ is fixed). Let $Y =
X_1 + \cdots + X_n$. Then $Y \sim \gammad(\alpha_1 + \cdots +
\alpha_n, \beta)$.
Proof.
We compute the mgf of $Y$ using \eqref{eq:gamma} to see
$$
M_{Y}(t) = \prod_{i=1}^{n}M_{X_i}(t)
         = \prod_{i=1}^{n}(1 - \beta t)^{-\alpha_i}
         = (1 - \beta t)^{-(\alpha_1 + \cdots + \alpha_n)},
$$
which is the mgf of $\gammad(\alpha_1 + \cdots + \alpha_n, \beta)$.
Uniqueness of mgfs gives us $Y \sim \gammad(\alpha_1 + \cdots +
\alpha_n, \beta)$ and the proof is complete.
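And the analogous simulation sketch for the gamma case (again an assumed example; scipy's gamma distribution takes a shape parameter `a` and a `scale` parameter, which correspond to $\alpha$ and $\beta$ in the parameterization used here).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alphas, beta = [0.5, 1.5, 3.0], 2.0
samples = sum(rng.gamma(shape=a, scale=beta, size=10**5) for a in alphas)

claimed = stats.gamma(a=sum(alphas), scale=beta)
print(stats.kstest(samples, claimed.cdf))  # p-value is typically large: consistent
```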
The $\chi^2$ Distribution
We define the $\chi^2$ distribution with $p$ degrees of freedom as
$\gammad(p/2, 2)$.
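As a quick sketch (using scipy's built-in distributions, with an arbitrary choice of $p$; this check is not in the original post), the identification of $\chi^2_p$ with $\gammad(p/2, 2)$ can be verified pointwise.

```python
import numpy as np
from scipy import stats

p = 5
x = np.linspace(0.1, 20, 200)
# chi-squared with p degrees of freedom vs gamma with shape p/2 and scale 2.
print(np.allclose(stats.chi2.pdf(x, df=p),
                  stats.gamma.pdf(x, a=p / 2, scale=2)))  # True
```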
Before we get to our first examples, let's make an observation. Let
$Y = X^2$ for some random variable $X$ with pdf $f_X$. For $y > 0$, we
see for the cdf $F_Y$ of $Y$
$$
F_Y(y) = \mu\left(X^2 \le y\right)
       = \mu\left(-\sqrt{y} \le X \le \sqrt{y}\right)
       = \int_{-\sqrt{y}}^{\sqrt{y}}f_X(x)\mathrm{d}x,
$$
and differentiating in $y$ gives the pdf
$$
f_Y(y) = \frac{1}{2\sqrt{y}}\left(f_X(\sqrt{y}) + f_X(-\sqrt{y})\right)
$$
for $y > 0$, with $f_Y(y) = 0$ for $y \le 0$. In particular, if $X \sim
\mathcal{N}(0, 1)$, then $f_Y(y) = \frac{1}{\sqrt{2\pi y}}e^{-y/2}$ for
$y > 0$, which is the pdf of $\gammad(1/2, 2)$; that is, the square of
a standard normal random variable has the $\chi^2$ distribution with
one degree of freedom.
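A small numerical sketch of this observation (assuming, as above, that $X$ is standard normal, and using scipy's $\chi^2$ pdf as the reference; not part of the original post).

```python
import numpy as np
from scipy import stats

def f_Y(y):
    # f_Y(y) = (f_X(sqrt(y)) + f_X(-sqrt(y))) / (2 sqrt(y)) for y > 0,
    # with X taken to be standard normal.
    s = np.sqrt(y)
    return (stats.norm.pdf(s) + stats.norm.pdf(-s)) / (2 * s)

y = np.linspace(0.05, 10, 100)
print(np.allclose(f_Y(y), stats.chi2.pdf(y, df=1)))  # True
```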
Now consider the two functions
$$
f_1(x) = \frac{1}{\sqrt{2\pi}\,x}e^{-\frac{(\ln x)^2}{2}}
\qquad\text{and}\qquad
f_2(x) = f_1(x)\bigl(1 + \sin(2\pi \ln x)\bigr)
$$
for $x > 0$, with $f_1(x) = f_2(x) = 0$ for $x \le 0$ (both are
non-negative since $|\sin| \le 1$), and let $\mu_1$ and $\mu_2$ denote
the corresponding probability measures.
Note that $\mu_1$ has the lognormal distribution. We see for a
fixed non-negative integer $n$, substituting $v = \ln x$,
$$
\int_{0}^{\infty}x^n f_1(x)\mathrm{d}x
  = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{nv - \frac{v^2}{2}}\mathrm{d}v
  = e^{\frac{n^2}{2}}\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-\frac{(v - n)^2}{2}}\mathrm{d}v
  = e^{\frac{n^2}{2}}.
\label{eq:ex2}
$$
Taking $n = 0$, we verify that $f_1$ is indeed a pdf. Furthermore, we
see that the $n$th moment of $\mu_1$ is $e^{\frac{n^2}{2}}$.
For $f_2(x)$ we see, with the same substitution $v = \ln x$,
$$
\int_{0}^{\infty}x^n f_2(x)\mathrm{d}x
  = \int_{0}^{\infty}x^n f_1(x)\mathrm{d}x
  + \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{nv - \frac{v^2}{2}}\sin(2\pi v)\mathrm{d}v,
$$
and, writing $u = v - n$ (so that $\sin(2\pi v) = \sin(2\pi u)$ for
integer $n$), the second term equals
$$
\frac{e^{\frac{n^2}{2}}}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-\frac{u^2}{2}}\sin(2\pi u)\mathrm{d}u.
\label{eq:ex3}
$$
Now the integrand in \eqref{eq:ex3} is an odd function and it's
trivial to see that the improper integral is well-defined (i.e. not in the
$\infty - \infty$ form), and so \eqref{eq:ex3} is evaluated as $0$.
Therefore, by \eqref{eq:ex2}, $\mu_2$ has the same moments as $\mu_1$.
Since $\mu_1 \neq \mu_2$, this shows that the moments of a distribution
need not determine it when the mgf does not exist (and indeed the
lognormal distribution has no mgf), so the hypothesis in the theorem
above cannot simply be dropped.
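Finally, a numerical sketch of the computation above (Python with scipy, not part of the original argument; the substitution $v = \ln x$ is used so the integrals are easy to evaluate, and only the first few moments are checked).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

for n in range(4):
    # After substituting v = ln(x), the n-th moment of f_1 is the integral of
    # e^{n v} * phi(v); the moment of f_2 has the extra factor (1 + sin(2 pi v)).
    m1 = quad(lambda v: np.exp(n * v) * norm.pdf(v), -15, 15, limit=200)[0]
    m2 = quad(lambda v: np.exp(n * v) * norm.pdf(v) * (1 + np.sin(2 * np.pi * v)),
              -15, 15, limit=200)[0]
    print(n, m1, m2, np.exp(n**2 / 2))  # m1, m2, and exp(n^2/2) agree
```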