Moment Generating Function

Learn the moment generating function (MGF) formula and how to derive moments of probability distributions, with worked examples.

The Formula

M_X(t) = E[e^(tX)]

The n-th moment: E[X^n] = M_X^(n)(0) = d^n/dt^n M_X(t) evaluated at t = 0

The moment generating function (MGF) is a powerful tool in probability theory that encodes all the moments of a random variable into a single function. Given a random variable X, its MGF is defined as M_X(t) = E[e^(tX)], which is the expected value of e raised to the power tX, where t is a real number.

The name "moment generating function" comes from its key property: you can extract any moment of the distribution by differentiating the MGF and evaluating at t = 0. The first derivative at t = 0 gives the mean (first moment), the second derivative gives E[X²] (from which you can compute the variance), the third derivative gives E[X³] (related to skewness), and so on.
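This moment-extraction property is easy to check symbolically. Below is a minimal sketch using sympy (the library choice is an assumption, not part of the original article), applied to the MGF of the normal distribution N(μ, σ²) given later on this page:

```python
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# MGF of the normal distribution N(mu, sigma^2)
M = sp.exp(mu*t + sigma**2 * t**2 / 2)

m1 = sp.diff(M, t).subs(t, 0)        # first moment: E[X]
m2 = sp.diff(M, t, 2).subs(t, 0)     # second moment: E[X^2]
var = sp.expand(m2 - m1**2)          # variance from the first two moments

print(m1)    # mu
print(var)   # sigma**2
```

Differentiating once and evaluating at t = 0 recovers the mean μ, and the second derivative yields E[X²] = μ² + σ², so the variance comes out as σ², exactly as the property promises.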

One of the most valuable properties of the MGF is uniqueness. If two random variables have the same moment generating function, they have the same probability distribution. This makes MGFs extremely useful for proving that a random variable follows a particular distribution.

Another important property involves sums of independent random variables. If X and Y are independent, then M_{X+Y}(t) = M_X(t) · M_Y(t). This multiplicative property makes it easy to find the distribution of a sum of independent variables, which is far simpler than working with convolutions of density functions.

Not all random variables have a moment generating function. The MGF exists only if E[etX] is finite for some interval around t = 0. The Cauchy distribution, for instance, has no MGF. In such cases, the characteristic function (which always exists) serves as an alternative.

Common MGFs include: for the normal distribution N(μ, σ²), M(t) = exp(μt + σ²t²/2); for the exponential distribution with rate λ, M(t) = λ/(λ − t) for t < λ; and for the Poisson distribution with parameter λ, M(t) = exp(λ(e^t − 1)).
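As a quick numeric sanity check on one of these formulas, a Monte Carlo estimate of E[e^(tX)] can be compared against the closed form. The sketch below uses numpy for the exponential distribution with rate λ = 2 (sample size, seed, and evaluation point t = 0.5 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 0.5                    # rate and evaluation point (t < lam)
x = rng.exponential(scale=1/lam, size=200_000)

mc = np.exp(t * x).mean()            # Monte Carlo estimate of E[e^(tX)]
closed = lam / (lam - t)             # closed-form MGF: lambda/(lambda - t)
print(mc, closed)                    # the two values should agree closely
```

With 200,000 samples the simulated expectation lands very close to the closed-form value λ/(λ − t) = 4/3.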

Variables

Symbol        Meaning
M_X(t)        Moment generating function of random variable X
t             Real-valued parameter
E[·]          Expected value operator
X^n           The n-th power of the random variable; its expectation E[X^n] is the n-th raw moment
M_X^(n)(0)    The n-th derivative of M_X(t) evaluated at t = 0

Example 1

Find the mean and variance of an exponential distribution with rate λ = 2 using its MGF.

MGF: M(t) = λ/(λ − t) = 2/(2 − t)

First derivative: M'(t) = 2/(2 − t)². At t = 0: M'(0) = 2/4 = 0.5

Second derivative: M''(t) = 4/(2 − t)³. At t = 0: M''(0) = 4/8 = 0.5

Var(X) = E[X²] − (E[X])² = 0.5 − 0.25 = 0.25

Mean = 0.5, Variance = 0.25 (equivalently, σ = 0.5)
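The derivatives in this example can be verified symbolically, for instance with sympy (a sketch; the library choice is an assumption):

```python
import sympy as sp

t = sp.symbols('t')
M = 2 / (2 - t)                      # MGF of the exponential with rate 2

mean = sp.diff(M, t).subs(t, 0)      # M'(0) = E[X]
ex2 = sp.diff(M, t, 2).subs(t, 0)    # M''(0) = E[X^2]
var = ex2 - mean**2                  # Var(X) = E[X^2] - (E[X])^2

print(mean, ex2, var)                # 1/2 1/2 1/4
```

The exact rationals 1/2 and 1/4 match the decimal results 0.5 and 0.25 computed above.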

Example 2

Show that the sum of two independent Poisson variables (with parameters λ_1 = 3 and λ_2 = 5) is Poisson with parameter 8.

MGF of Poisson(λ): M(t) = exp(λ(e^t − 1))

M_{X+Y}(t) = M_X(t) · M_Y(t) = exp(3(e^t − 1)) · exp(5(e^t − 1))

= exp((3 + 5)(e^t − 1)) = exp(8(e^t − 1))

This is the MGF of Poisson(8), confirming the sum is Poisson with parameter λ_1 + λ_2 = 8.
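The MGF algebra in this example can be confirmed symbolically, e.g. with sympy (a sketch under the same assumption as above):

```python
import sympy as sp

t = sp.symbols('t')

def poisson_mgf(lam):
    """MGF of a Poisson(lam) random variable: exp(lam * (e^t - 1))."""
    return sp.exp(lam * (sp.exp(t) - 1))

# By independence, the MGF of X + Y is the product of the individual MGFs
product = poisson_mgf(3) * poisson_mgf(5)
print(product.equals(poisson_mgf(8)))       # True
```

The product of the two MGFs is identical to the Poisson(8) MGF, which by the uniqueness property pins down the distribution of the sum.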

When to Use It

Moment generating functions are used throughout probability theory and statistics.

  • Deriving the mean, variance, skewness, and kurtosis of a distribution
  • Proving the distribution of sums of independent random variables
  • Identifying an unknown distribution by matching its MGF to a known one
  • Proving limit theorems like the Central Limit Theorem
  • Obtaining tail bounds for probability distributions (Chernoff bounds)
  • Modeling aggregate claims in actuarial science and insurance
