###### 3.6.1 Singular Random Vectors

Suppose random vector **X** is singular with covariance matrix **Σ**. There exists a row vector **b** ≠ **0** such that **bΣb**′ = 0. Consider the random variable **bX**. By [3.28],

var(**bX**) = **bΣb**′ = 0.    [3.35]

Since our random variable **bX** has 0 variance, it must equal some constant *a*. This argument is reversible, so we conclude that a random vector **X** is singular if and only if there exists a row vector **b** ≠ **0** and a constant *a* such that

**bX** = *a*.    [3.36]

Dispensing with matrix notation, this becomes

*b*₁*X*₁ + *b*₂*X*₂ + ⋯ + *b*ₙ*X*ₙ = *a*.    [3.37]

Since **b** ≠ **0**, at least one component *b*ᵢ is nonzero. Without loss of generality, assume *b*₁ ≠ 0. Rearranging [3.37], we obtain

*X*₁ = *a*/*b*₁ − (*b*₂/*b*₁)*X*₂ − ⋯ − (*b*ₙ/*b*₁)*X*ₙ,    [3.38]

which expresses component *X*₁ as a linear polynomial of the other components *X*ᵢ. We conclude that a random vector **X** is singular if and only if one of its components is a linear polynomial of the other components. In this sense, a singular covariance matrix indicates that at least one component of a random vector is extraneous.
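As a numerical sanity check of [3.35]–[3.38] (a sketch with assumed example values, not from the text): if one component of **X** is a linear polynomial of the others, then some row vector **b** makes **bX** constant, so **bΣb**′ = 0 and the covariance matrix is singular.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustration (not from the text): X3 = 2*X1 - 5*X2 + 1,
# with X1 and X2 independent, so X is singular.
n = 100_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2.0 * x1 - 5.0 * x2 + 1.0
X = np.vstack([x1, x2, x3])

sigma = np.cov(X)                  # 3x3 sample covariance matrix
b = np.array([2.0, -5.0, -1.0])    # row vector with b X = -1, a constant

print(np.linalg.det(sigma))        # ~0: sigma is singular
print(b @ sigma @ b)               # ~0: var(b X) = b sigma b' = 0
```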

If one component of **X** is a linear polynomial of the rest, then all realizations of **X** must fall in a plane within ℝⁿ. The random vector **X** can be thought of as an *m*-dimensional random vector sitting in a plane within ℝⁿ, where *m* < *n*. This is illustrated with realizations of a singular two-dimensional random vector **X** in Exhibit 3.6.

*Exhibit 3.6: Realizations of a singular two-dimensional random vector **X**. Component *X*₂ is a linear polynomial of component *X*₁.*

If a random vector **X** is singular, but the plane it sits in is not aligned with the coordinate system of ℝⁿ, we may not immediately realize that it is singular from its covariance matrix **Σ**. A simple test for singularity is to calculate the determinant |**Σ**| of the covariance matrix. If this equals 0, **X** is singular. Once we know that **X** is singular, we can apply a change of variables to eliminate extraneous components *X*ᵢ and transform **X** into an equivalent *m*-dimensional random vector **Y**, *m* < *n*. The change of variables will do this by transforming (rotating, shifting, etc.) the plane that realizations of **X** sit in so that it aligns with the coordinate system of ℝⁿ. Such a change of variables is obtained with a linear polynomial of the form

**X** = **kY** + **d**.    [3.39]
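The determinant test can be sketched as follows. The covariance matrix here is an assumed example, not the text's [3.40]; its third row equals 2 × (row 1) − 5 × (row 2), which is not obvious at a glance. Because floating-point determinants are rarely exactly 0, comparing the matrix rank to its dimension is a more robust check in practice.

```python
import numpy as np

# Assumed example covariance matrix (not the text's [3.40]); it is
# singular, but not obviously so from inspection.
sigma = np.array([
    [1.0,  0.0,   2.0],
    [0.0,  1.0,  -5.0],
    [2.0, -5.0,  29.0],
])

print(np.linalg.det(sigma))          # ~0: the random vector is singular

# Numerically safer than testing |det| == 0 exactly:
print(np.linalg.matrix_rank(sigma))  # 2 < 3, so sigma is rank-deficient
```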

Consider a three-dimensional random vector **X** with mean vector and covariance matrix

[3.40]

We note that **Σ** has determinant |**Σ**| = 0, so it is singular. We propose to transform **X** into an equivalent two-dimensional random vector **Y** using a linear polynomial of the form [3.39]. For convenience, let's find a transformation such that **Y** will have mean vector **0** and covariance matrix **I**:

[3.41]

We first solve for **k**. By [3.31],

**Σ**_X = **kΣ**_Y **k**′ = **kIk**′ = **kk**′,    [3.42]

so we seek a factorization **Σ**_X = **kk**′. Applying the Cholesky factorization and discarding an extraneous column of 0's, as described in Section 2.7, we obtain

[3.43]
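This factorization step can be sketched with a simple semidefinite variant of the Cholesky algorithm (the text's Section 2.7 routine is not reproduced here, and the matrix below is an assumed example, not the text's [3.40]): when a pivot is numerically zero, the corresponding column of the factor is left as zeros and then discarded, leaving an *n* × *m* matrix **k** with **Σ** = **kk**′.

```python
import numpy as np

def semidefinite_cholesky(sigma, tol=1e-10):
    """Factor a positive semidefinite sigma as k @ k.T, with k of shape
    (n, m), m = rank(sigma): a lower-triangular Cholesky pass that zeroes
    a column when its pivot vanishes, then drops the all-zero columns."""
    n = sigma.shape[0]
    g = np.zeros((n, n))
    for j in range(n):
        pivot = sigma[j, j] - g[j, :j] @ g[j, :j]
        if pivot > tol:
            g[j, j] = np.sqrt(pivot)
            for i in range(j + 1, n):
                g[i, j] = (sigma[i, j] - g[i, :j] @ g[j, :j]) / g[j, j]
        # else: pivot is (numerically) zero -> leave column j as zeros
    keep = np.any(g != 0.0, axis=0)  # discard extraneous columns of 0's
    return g[:, keep]

# Assumed singular covariance matrix (rank 2).
sigma = np.array([
    [1.0,  0.0,   2.0],
    [0.0,  1.0,  -5.0],
    [2.0, -5.0,  29.0],
])
k = semidefinite_cholesky(sigma)
print(k.shape)                       # (3, 2)
print(np.allclose(k @ k.T, sigma))   # True
```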

Solving next for **d**: by [3.30], **μ**_X = **kμ**_Y + **d**, and since **μ**_Y = **0**, we obtain **d** = **μ**_X:

[3.44]

[3.45]

[3.46]

[3.47]

Accordingly, our transformation is

[3.48]

Exhibit 3.7 illustrates how this change of variables transforms the plane in which realizations of **X** sit so that it aligns with the coordinate system of ℝ². The third extraneous component of **X** "drops out."
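The whole change of variables can be sketched end to end. The mean vector, covariance matrix, and factor **k** below are assumed example values that mirror, but are not, the text's [3.40] and [3.43]. Realizations of **Y** are recovered from realizations of **X** by solving **X** = **kY** + **d** in the least-squares sense, and the third component of **X** indeed "drops out."

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed mean vector and (singular) covariance matrix for X.
mu = np.array([2.0, 0.0, 1.0])
sigma = np.array([
    [1.0,  0.0,   2.0],
    [0.0,  1.0,  -5.0],
    [2.0, -5.0,  29.0],
])

# A factorization sigma = k @ k.T with k of shape (3, 2), written out
# directly here (e.g. obtainable via a semidefinite Cholesky pass).
k = np.array([
    [1.0,  0.0],
    [0.0,  1.0],
    [2.0, -5.0],
])
d = mu                                # since E(Y) = 0, [3.30] gives d = E(X)

# Simulate X through the model X = k Y + d with Y ~ mean 0, covariance I.
Y = rng.normal(size=(2, 100_000))
X = k @ Y + d[:, None]

# Recover Y from X by least squares (k has full column rank).
Y_rec, *_ = np.linalg.lstsq(k, X - d[:, None], rcond=None)
print(np.allclose(Y_rec, Y))          # True: Y is recovered exactly
print(np.cov(X)[2, 2])                # ~29, consistent with sigma
```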

###### Exercises

Below are described four three-dimensional random vectors: **W**, **V**, **X**, and **Y**. Assuming their second moments exist, which of the random vectors has a singular covariance matrix?

- Components *V*₁ and *V*₂ are independent, with *V*₃ = 2*V*₁ − 5*V*₂ + 1.
- Components *W*₁ and *W*₂ are independent, with *W*₃ = *W*₁ − log(*W*₂).
- Components *X*₁, *X*₂, and *X*₃ represent next year's total returns for three different companies' common stocks.
- Components *Y*₁ and *Y*₂ represent tomorrow's prices for the nearby 3-month Treasury bill and 3-month Eurodollar futures. Component *Y*₃ represents tomorrow's price difference between those two futures.

Consider a singular random vector **X** with mean vector and covariance matrix

[3.49]

Transform **X** into an equivalent two-dimensional random vector **Y** with mean vector **0** and covariance matrix **I**:

[3.50]