###### 3.10.4 Joint-normal Distributions

Let **X** be an *n*-dimensional random vector with mean vector **μ** and covariance matrix **Σ**. Suppose the marginal distribution of each component *X*_{i} is normal. Let *Y* be a random variable defined as a linear polynomial

[3.121]

*Y* = *a* + *b*_{1}*X*_{1} + *b*_{2}*X*_{2} + … + *b*_{n}*X*_{n}

of **X**. Using [3.27] and [3.28], we can calculate the mean μ_{Y} and standard deviation σ_{Y} of *Y*. Knowing only that the marginal distributions of the *X*_{i} are normal, there is little more we can say about the distribution of *Y*. However, there is an additional condition we can impose upon **X** that will cause *Y* to be normally distributed. That condition is joint-normality.

The definition of joint-normality is almost trivial. A random vector **X** is said to be **joint-normal** if every nontrivial linear polynomial *Y* of **X** is normal. Joint-normal distributions are sometimes called **multivariate normal** or **multinormal** distributions.

We denote the *n*-dimensional joint-normal distribution with mean vector **μ** and covariance matrix **Σ** as *N*_{n}(**μ**, **Σ**). If **Σ** is positive definite, it has PDF

[3.122]

φ(**x**) = (2π)^{–n/2} |**Σ**|^{–1/2} exp[–(**x** – **μ**)′ **Σ**^{–1} (**x** – **μ**)/2]

where |**Σ**| is the determinant of **Σ**. Exhibit 3.20 illustrates a joint-normal distribution in two random variables *X*_{1} and *X*_{2}. If we define *Y* = *X*_{1} + *X*_{2}, then *Y* is normal.
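As a quick numerical check, the sketch below (NumPy; the mean vector, covariance matrix, sample size, and seed are our own illustrative choices, not from the text) samples a bivariate joint-normal distribution and confirms that the linear polynomial *Y* = *X*_{1} + *X*_{2} has the mean and variance implied by [3.27] and [3.28]:

```python
import numpy as np

# Hypothetical parameters for a bivariate joint-normal distribution.
mu = np.array([1.0, 2.0])
Sigma = np.array([[4.0, 1.5],
                  [1.5, 9.0]])

rng = np.random.default_rng(0)
X = rng.multivariate_normal(mu, Sigma, size=200_000)

# Y = X1 + X2 is a linear polynomial of X, hence normal.
Y = X[:, 0] + X[:, 1]

# By [3.27]/[3.28]: E[Y] = mu1 + mu2 = 3,
# Var[Y] = Sigma11 + Sigma22 + 2*Sigma12 = 4 + 9 + 3 = 16.
print(Y.mean())   # close to 3
print(Y.var())    # close to 16
```

Because **X** is joint-normal, the same moments could have been read off directly without simulation; the sample estimates merely confirm them.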

Now let’s illustrate how a random vector may fail to be joint-normal despite each of its components being marginally normal.9 Let **X** be a two-dimensional random vector with components *X*_{1} and *X*_{2}. Let *X*_{1} and *Z* be independent *N*(0,1) random variables, and set *X*_{2} equal to –|*Z*| or |*Z*|, depending on whether *X*_{1} is negative or non-negative. By construction, both *X*_{1} and *X*_{2} are *N*(0,1), but their realizations are always either both negative or both non-negative. The vector **X**, whose PDF is illustrated in Exhibit 3.21, is not joint-normal. In this case, the random variable *Y* = *X*_{1} + *X*_{2} is not normal. Instead, it has the PDF illustrated in Exhibit 3.22.
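This counterexample is easy to simulate. The sketch below (NumPy; sample size and seed are arbitrary choices of ours) constructs *X*_{2} from *X*_{1} and *Z* as described, confirms its marginal moments match *N*(0,1), and exhibits the non-normality of *Y* = *X*_{1} + *X*_{2}: since *X*_{1} and *X*_{2} always share a sign, |*Y*| = |*X*_{1}| + |*X*_{2}|, so *Y* falls near zero far less often than any normal random variable would.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# X1 and Z are independent N(0,1); X2 takes the magnitude of Z
# with the sign of X1, so X1 and X2 always share the same sign.
X1 = rng.standard_normal(n)
Z = rng.standard_normal(n)
X2 = np.where(X1 >= 0, np.abs(Z), -np.abs(Z))

# Each component is marginally N(0,1)...
print(X2.mean(), X2.std())   # close to 0 and 1

# ...but Y = X1 + X2 is not normal: |Y| = |X1| + |X2|,
# so realizations of Y near zero are rare.
Y = X1 + X2
p_near_zero = np.mean(np.abs(Y) < 0.1)
print(p_near_zero)           # far below any normal benchmark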

[Exhibit 3.21: A random vector **X** can have two components that are both marginally normal but not be joint-normal.]

[Exhibit 3.22: The PDF of *Y* = *X*_{1} + *X*_{2}, where *X*_{1} and *X*_{2} are components of the random vector **X** whose PDF is illustrated in Exhibit 3.21. This example illustrates that a linear polynomial of normal random variables need not be normal.]

A random vector is joint-normal with uncorrelated components if and only if the components are independent normal random variables.
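One direction of this equivalence can be checked numerically. The sketch below (NumPy; covariance choice, threshold, sample size, and seed are ours) samples a joint-normal vector with uncorrelated components and verifies that a joint probability factors into the product of its marginals, as independence requires:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# Joint-normal with uncorrelated components (diagonal covariance).
X = rng.multivariate_normal([0.0, 0.0], np.eye(2), size=n)

# Independence implies joint probabilities factor:
# P(X1 > 1 and X2 > 1) = P(X1 > 1) * P(X2 > 1).
p_joint = np.mean((X[:, 0] > 1) & (X[:, 1] > 1))
p_prod = np.mean(X[:, 0] > 1) * np.mean(X[:, 1] > 1)
print(p_joint, p_prod)   # approximately equal
```

Note that uncorrelatedness alone does not suffice without joint-normality; it is the joint-normal assumption that upgrades zero correlation to full independence.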

A property of joint-normal distributions is that marginal distributions and conditional distributions are either normal (if they are univariate) or joint-normal (if they are multivariate). Specifically, let **X** ~ *N*_{n}(**μ**, **Σ**). Select *k* components. Without loss of generality, suppose these are the first *k* components *X*_{1}, *X*_{2}, …, *X*_{k}. Let **X**_{1} be a *k*-dimensional vector comprising these components, and let **X**_{2} be an (*n* – *k*)-dimensional vector of the remaining components. These partition **X**, **μ** and **Σ** into sub-vectors and sub-matrices as follows:

[3.123]

**X** = ( **X**_{1} )   **μ** = ( **μ**_{1} )   **Σ** = ( **Σ**_{1,1}  **Σ**_{1,2} )
        ( **X**_{2} )           ( **μ**_{2} )           ( **Σ**_{2,1}  **Σ**_{2,2} )

The marginal distribution of **X**_{1} is *N*_{k}(**μ**_{1}, **Σ**_{1,1}) and that of **X**_{2} is *N*_{n–k}(**μ**_{2}, **Σ**_{2,2}). If **Σ**_{2,2} is positive definite, the conditional distribution of **X**_{1} given that **X**_{2} = **x**_{2} is

[3.124]

**X**_{1} | **x**_{2} ~ *N*_{k}( **μ**_{1} + **Σ**_{1,2}**Σ**_{2,2}^{–1}(**x**_{2} – **μ**_{2}),  **Σ**_{1,1} – **Σ**_{1,2}**Σ**_{2,2}^{–1}**Σ**_{2,1} )
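The partition and the conditional formula translate directly into array slicing. The sketch below (NumPy; the three-dimensional parameters, the choice *k* = 1, and the conditioning value **x**_{2} are our own illustrative inputs) computes the conditional mean and covariance of **X**_{1} given **X**_{2} = **x**_{2}:

```python
import numpy as np

# Hypothetical 3-dimensional joint-normal, partitioned with k = 1:
# X1 = first component, X2 = remaining two components.
mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 2.0, 0.3],
                  [0.5, 0.3, 1.0]])

# Partition mu and Sigma as in [3.123].
mu1, mu2 = mu[:1], mu[1:]
S11 = Sigma[:1, :1]
S12 = Sigma[:1, 1:]
S21 = Sigma[1:, :1]
S22 = Sigma[1:, 1:]

# Conditional distribution of X1 given X2 = x2, per [3.124].
x2 = np.array([1.5, -0.5])
S22_inv = np.linalg.inv(S22)
cond_mean = mu1 + S12 @ S22_inv @ (x2 - mu2)
cond_cov = S11 - S12 @ S22_inv @ S21

print(cond_mean, cond_cov)
```

Observe that the conditional covariance does not depend on the observed value **x**_{2}, and it is never larger than the unconditional **Σ**_{1,1}: conditioning on **X**_{2} can only reduce uncertainty about **X**_{1}.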

If **X** ~ *N*_{n}(**μ**, **Σ**), **b** is a constant *m* × *n* matrix, and **a** is an *m*-dimensional constant vector, then

[3.125]

**bX** + **a** ~ *N*_{m}( **bμ** + **a**,  **bΣb**′ )

This generalizes property [3.94] of one-dimensional normal distributions.
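The affine-transformation property is also easy to check by simulation. In the sketch below (NumPy; the input parameters and the matrix **b** and vector **a** are our own illustrative choices), the sample moments of the transformed draws match the mean **bμ** + **a** and covariance **bΣb**′ predicted by [3.125]:

```python
import numpy as np

# Hypothetical joint-normal input and affine map.
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.4, 0.2],
                  [0.4, 1.0, 0.1],
                  [0.2, 0.1, 3.0]])
b = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 1.0]])   # constant 2x3 matrix
a = np.array([5.0, -1.0])          # constant 2-vector

# Per [3.125], bX + a is joint-normal with these parameters.
mu_out = b @ mu + a
Sigma_out = b @ Sigma @ b.T

# Compare against sample moments of transformed draws.
rng = np.random.default_rng(2)
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ b.T + a
print(mu_out, Y.mean(axis=0))
print(Sigma_out)
print(np.cov(Y.T))
```

Here **b** reduces dimension from *n* = 3 to *m* = 2, illustrating that any affine image of a joint-normal vector, of any dimension up to *n*, is again joint-normal.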