 # 3.12  Moment-Generating Functions

The moment-generating function (MGF) of a random variable X is defined as:

$$M_X(w) = E\left(e^{wX}\right) \qquad [3.138]$$

for w ∈ ℝ. We call it the moment-generating function because it provides a means of calculating the moments of X. If the MGF is finite on an open interval about w = 0, then all the moments of X exist, and the kth moment of X equals the kth derivative with respect to w of the MGF evaluated at w = 0. Heuristically, we motivate this result by applying the Taylor series expansion for the exponential function in definition [3.138]:

$$M_X(w) = E\left(\sum_{k=0}^{\infty} \frac{(wX)^k}{k!}\right) = E\left(1 + wX + \frac{w^2 X^2}{2!} + \frac{w^3 X^3}{3!} + \cdots\right) \qquad [3.139]$$

If M_X(w) is finite on some interval about the point w = 0, it can be shown that the expectation of the sum equals the sum of the expectations:

$$M_X(w) = \sum_{k=0}^{\infty} \frac{w^k\,E(X^k)}{k!} \qquad [3.140]$$

$$= 1 + w\,E(X) + \frac{w^2\,E(X^2)}{2!} + \frac{w^3\,E(X^3)}{3!} + \cdots \qquad [3.141]$$

You may confirm that taking the kth derivative of [3.141] with respect to w and evaluating at w = 0 yields the kth moment E(Xᵏ).
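To see this moment-extraction property concretely, here is a small numerical sketch (not from the text). We build the MGF of a simple discrete random variable — the distribution is hypothetical, chosen only for illustration — and recover its first two moments by finite-difference differentiation at w = 0:

```python
import math

# Hypothetical discrete random variable: X takes values xs with probabilities ps.
xs = [1.0, 2.0, 5.0]
ps = [0.5, 0.3, 0.2]

def mgf(w):
    # M_X(w) = E(e^{wX}), computed directly from the distribution.
    return sum(p * math.exp(w * x) for p, x in zip(ps, xs))

# Differentiate numerically at w = 0 with central differences.
h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)                # ~ E(X)
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2    # ~ E(X^2)

# Exact moments for comparison.
exact_m1 = sum(p * x for p, x in zip(ps, xs))     # 0.5 + 0.6 + 1.0 = 2.1
exact_m2 = sum(p * x**2 for p, x in zip(ps, xs))  # 0.5 + 1.2 + 5.0 = 6.7
print(m1, exact_m1)
print(m2, exact_m2)
```

The finite-difference values agree with the exact moments to several decimal places, which is the numerical counterpart of differentiating [3.141] term by term.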

Let X be a random variable and a, b ∈ ℝ. Define a new random variable Y = bX + a. By definition [3.138], the MGF for the new random variable is related to the MGF of X by

$$M_Y(w) = E\left(e^{w(bX + a)}\right) = e^{aw} M_X(bw) \qquad [3.142]$$
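Relationship [3.142] is easy to check numerically. The sketch below uses a hypothetical discrete distribution (chosen only for illustration) and compares the MGF of Y = bX + a computed directly against e^{aw} M_X(bw):

```python
import math

# Hypothetical discrete X; define Y = b*X + a.
xs = [0.0, 1.0, 3.0]
ps = [0.2, 0.5, 0.3]
a, b = 1.5, -2.0

def mgf_x(w):
    # M_X(w) = E(e^{wX}) computed directly from the distribution.
    return sum(p * math.exp(w * x) for p, x in zip(ps, xs))

def mgf_y(w):
    # Direct computation of E(e^{wY}) over the same outcomes.
    return sum(p * math.exp(w * (b * x + a)) for p, x in zip(ps, xs))

# The two sides of [3.142] agree at every w (to floating-point precision).
for w in [-0.5, 0.1, 0.7]:
    lhs = mgf_y(w)
    rhs = math.exp(a * w) * mgf_x(b * w)
    assert abs(lhs - rhs) < 1e-12 * max(1.0, abs(lhs))
```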

More generally, suppose X is an n-dimensional random vector with independent components Xᵢ, b is an n-dimensional row vector (b₁  b₂  …  bₙ), and a ∈ ℝ. Define the random variable Y = bX + a. The MGF of Y is

$$M_Y(w) = e^{aw} \prod_{i=1}^{n} M_{X_i}(b_i w) \qquad [3.143]$$
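The product form in [3.143] can be checked the same way. A sketch under assumed (hypothetical) discrete marginals, enumerating the joint distribution directly — independence means the joint probabilities are products of the marginal probabilities:

```python
import math

# Two independent hypothetical discrete components X1, X2; Y = b1*X1 + b2*X2 + a.
x1, p1 = [0.0, 1.0], [0.4, 0.6]
x2, p2 = [1.0, 2.0, 4.0], [0.3, 0.5, 0.2]
b1, b2, a = 2.0, -1.0, 0.5

def mgf(xs, ps, w):
    # Marginal MGF E(e^{wX}) for a discrete distribution.
    return sum(p * math.exp(w * x) for p, x in zip(ps, xs))

def mgf_y(w):
    # Direct: enumerate joint outcomes (independence => product probabilities).
    return sum(
        q1 * q2 * math.exp(w * (b1 * v1 + b2 * v2 + a))
        for v1, q1 in zip(x1, p1)
        for v2, q2 in zip(x2, p2)
    )

# Compare with the right-hand side of [3.143].
for w in [-0.3, 0.2, 0.8]:
    rhs = math.exp(a * w) * mgf(x1, p1, b1 * w) * mgf(x2, p2, b2 * w)
    assert abs(mgf_y(w) - rhs) < 1e-12 * max(1.0, rhs)
```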

A uniform, U(a,b), random variable has MGF

$$M_X(w) = \begin{cases} \dfrac{e^{bw} - e^{aw}}{w(b - a)} & w \neq 0 \\ 1 & w = 0 \end{cases} \qquad [3.144]$$
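As a quick sanity check on the uniform MGF (a numerical sketch with an arbitrarily chosen interval, not from the text), the closed form can be compared against a direct midpoint-rule evaluation of E(e^{wX}) for X ~ U(a, b):

```python
import math

a, b = 2.0, 5.0  # arbitrary interval for illustration

def mgf_uniform(w):
    # (e^{bw} - e^{aw}) / (w (b - a)), with the limit 1 at w = 0.
    if w == 0.0:
        return 1.0
    return (math.exp(b * w) - math.exp(a * w)) / (w * (b - a))

def mgf_numeric(w, n=20000):
    # Midpoint-rule approximation of E(e^{wX}) = (1/(b-a)) * integral of e^{wx}.
    dx = (b - a) / n
    return sum(math.exp(w * (a + (i + 0.5) * dx)) for i in range(n)) * dx / (b - a)

for w in [-0.4, 0.3, 1.0]:
    assert abs(mgf_uniform(w) - mgf_numeric(w)) < 1e-6 * mgf_uniform(w)
```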

Those of N(μ, σ²) or χ²(ν, δ²) random variables are, respectively,

$$M_X(w) = \exp\left(\mu w + \frac{\sigma^2 w^2}{2}\right) \qquad [3.145]$$

$$M_X(w) = (1 - 2w)^{-\nu/2} \exp\left(\frac{\delta^2 w}{1 - 2w}\right), \quad w < \tfrac{1}{2} \qquad [3.146]$$
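Differentiating the normal MGF [3.145] at w = 0 should recover E(X) = μ and E(X²) = μ² + σ². A numerical sketch with arbitrary parameter values, using the same finite-difference idea as before:

```python
import math

mu, sigma = 0.8, 1.3  # arbitrary N(mu, sigma^2) parameters for illustration

def mgf_normal(w):
    # MGF of N(mu, sigma^2): exp(mu*w + sigma^2 * w^2 / 2), per [3.145].
    return math.exp(mu * w + 0.5 * sigma**2 * w**2)

h = 1e-4
m1 = (mgf_normal(h) - mgf_normal(-h)) / (2 * h)                 # ~ E(X) = mu
m2 = (mgf_normal(h) - 2 * mgf_normal(0.0) + mgf_normal(-h)) / h**2  # ~ E(X^2)

print(m1, mu)                      # first moment
print(m2, mu**2 + sigma**2)        # second moment
```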

The MGF for a lognormal random variable is derived by Leipnik (1991). It is complicated, so we do not present it here.

###### Exercises
3.41 Consider a two-dimensional random vector Z whose components are independent and each has a U(0,1) marginal PDF. Let Y = Z₁ + Z₂. Use moment-generating functions to calculate E(Y²).