# 3.16 The Inversion Theorem

Much of this chapter has been devoted to studying linear polynomials of random vectors. Results have included:

- formulas [3.30] and [3.31] for calculating the means and covariances of linear polynomials of random vectors;
- the use of moment-generating functions to calculate moments of linear polynomials of independent random variables;
- the definition that a linear polynomial of a joint-normal random vector is normal;
- the fact that a quadratic polynomial of a joint-normal random vector can be expressed as a linear polynomial of independent chi-squared and normal random variables;
- the central limit theorem describing certain linear polynomials of random variables as being approximately normal.

In this section, we present an inversion theorem. Although such theorems are primarily of theoretical interest, we shall use ours for the practical purpose of evaluating the CDF of a linear polynomial of independent random variables.

###### 3.16.1 Complex random variables

To define characteristic functions, we must extend the notion of random variables into the complex plane. Let *U*_{1} and *U*_{2} be real random variables, and let *i* = √−1. Then

$$Z = U_1 + U_2\,i \qquad [3.212]$$

is a **complex random variable**. We define its expectation as

$$E(Z) = E(U_1) + E(U_2)\,i \qquad [3.213]$$
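As a quick numerical illustration (a sketch assuming NumPy; the particular distributions are illustrative), the expectation of a complex random variable can be estimated componentwise, exactly as [3.213] prescribes:

```python
import numpy as np

rng = np.random.default_rng(0)
u1 = rng.normal(2.0, 1.0, size=100_000)   # U1: real part
u2 = rng.uniform(0.0, 1.0, size=100_000)  # U2: imaginary part
z = u1 + 1j * u2                          # complex random variable, as in [3.212]

# Sample analogue of [3.213]: E(Z) = E(U1) + E(U2) i
lhs = z.mean()
rhs = u1.mean() + 1j * u2.mean()
assert abs(lhs - rhs) < 1e-12
```

The complex mean decomposes into the means of the real and imaginary parts, so the two estimates agree to floating-point precision.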

###### 3.16.2 Characteristic functions

Characteristic functions are similar to MGFs. We define the **characteristic function** of a random variable *X* as

$$\phi_X(w) = E\!\left(e^{iwX}\right) \qquad [3.214]$$

where *w* is real and *i* = √−1. If *X* is continuous,

$$\phi_X(w) = \int_{-\infty}^{\infty} e^{iwx}\, f_X(x)\, dx \qquad [3.215]$$

If **X** is a random vector with independent components *X*_{j}, and *Y* is a linear polynomial of **X**,

$$Y = \boldsymbol{b}\boldsymbol{X} + a \qquad [3.216]$$

with **b** a real row vector and *a* a real scalar, then, analogous to [3.143] for MGFs,

$$\phi_Y(w) = e^{iaw} \prod_{j} \phi_{X_j}(b_j w) \qquad [3.217]$$
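To see [3.217] in action, consider a hedged numerical sketch (NumPy assumed) with *Y* = *b*_{1}*X*_{1} + *b*_{2}*X*_{2} + *a* for independent standard normals, whose characteristic function exp(−*w*²/2) is standard. Then *Y* ~ *N*(*a*, *b*_{1}² + *b*_{2}²), and the product formula reproduces φ_{Y} exactly:

```python
import numpy as np

# Independent standard normals X1, X2; Y = b1*X1 + b2*X2 + a as in [3.216].
b1, b2, a = 3.0, -2.0, 5.0
w = np.linspace(-4.0, 4.0, 201)

def phi_std_normal(w):
    """Characteristic function of N(0,1): exp(-w^2 / 2)."""
    return np.exp(-0.5 * w**2)

# Right-hand side of [3.217]: e^{iaw} * phi_X1(b1 w) * phi_X2(b2 w)
rhs = np.exp(1j * a * w) * phi_std_normal(b1 * w) * phi_std_normal(b2 * w)

# Y ~ N(a, b1^2 + b2^2), so phi_Y(w) = exp(i a w - (b1^2 + b2^2) w^2 / 2)
lhs = np.exp(1j * a * w - 0.5 * (b1**2 + b2**2) * w**2)

assert np.allclose(lhs, rhs)
```

The agreement is an algebraic identity here, since the exponents on both sides sum to the same expression.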

A uniform, *U*(*a*,*b*), random variable has characteristic function

$$\phi_X(w) = \frac{e^{iwb} - e^{iwa}}{iw(b-a)} \qquad [3.218]$$

Characteristic functions for *N*(μ,σ^{2}) and χ^{2}(ν,δ^{2}) random variables are, respectively,

$$\phi_X(w) = \exp\!\left(i\mu w - \frac{\sigma^2 w^2}{2}\right) \qquad [3.219]$$

$$\phi_X(w) = (1 - 2iw)^{-\nu/2} \exp\!\left(\frac{i\delta^2 w}{1 - 2iw}\right) \qquad [3.220]$$

The characteristic function for a lognormal random variable is derived by Leipnik (1991). It is complicated, so we do not present it here.

###### 3.16.3 Inversion theorem

The CDF of a random variable is uniquely determined by its characteristic function. If two random variables have the same characteristic function, they have the same CDF. An **inversion theorem** provides the CDF of a random variable *X* in terms of its characteristic function:

$$F_X(x) = \frac{1}{2} - \frac{1}{\pi}\int_0^{\infty} \frac{\operatorname{Im}\!\left[e^{-iwx}\,\phi_X(w)\right]}{w}\, dw \qquad [3.221]$$

###### Exercises

Determine the characteristic function for the following random variables:

1. *X* ~ *N*(1,4);
2. *Y* = 3*Q* + *R* + 5, where *Q* ~ *U*(0,1) and *R* ~ χ^{2}(2,1) are independent;
3. *Z* = , where *X*_{1} and *X*_{2} ~ *N*(0,1) are independent.

Use the characteristic function [3.219] of the normal distribution and [3.217] to prove that, if *X*_{1} ~ *N*(μ_{1},σ_{1}^{2}) and *X*_{2} ~ *N*(μ_{2},σ_{2}^{2}) are independent, then

$$X_1 + X_2 \sim N\!\left(\mu_1 + \mu_2,\; \sigma_1^2 + \sigma_2^2\right) \qquad [3.222]$$