# 3.7 Principal Component Analysis

With principal component analysis, we transform a random vector **Z** with correlated components *Z_{i}* into a random vector **D** with uncorrelated components *D_{i}*. This is called an **orthogonalization** of **Z**.

Principal component analysis can be performed on any random vector **Z** whose second moments exist, but it is most useful with multicollinear random vectors. Principal component analysis takes the plane in which realizations of a multicollinear random vector "almost" sit and realigns it with the coordinate system of ℝ^{n}. The components of **D** that are perpendicular to the transformed plane have small, almost trivial standard deviations. Discarding these components provides a lower-dimensional approximate representation for **Z**. This is illustrated with realizations of a multicollinear two-dimensional random vector **Z** in Exhibit 3.9.

Exhibit 3.9: Realizations of **Z** are illustrated in the left graph. Principal component analysis transforms **Z** into an equivalent multicollinear random vector **D** that is aligned with the coordinate system of ℝ^{2}. Realizations of **D** are shown in the middle graph. Discarding the second component *D_{2}* of **D** transforms **D** into a one-dimensional approximate representation of the two-dimensional **Z**. Realizations of this representation are shown in the right graph.

###### 3.7.1 Example: European Currencies

Suppose today is June 30, 2000. We consider a random vector **Z** whose components represent the simple price returns that specific European currencies will realize versus the US dollar (USD) over the upcoming trading day:

[3.56]

Exhibit 3.10 graphs 18 months of daily exchange-rate data drawn from the period immediately following the launch of the new EUR currency. In our data, the EUR weakens following its launch, and the remaining European currencies—those that did not join the EUR on January 1, 1999—weaken in sympathy. All the currencies track the EUR, but the GBP does so the least. It is less correlated with the EUR and loses value more slowly.

We assume **μ_{Z}** = **0**.⁵ Based upon a time series analysis of the historical price data, we construct a covariance matrix for **Z**:

[3.57]

The corresponding correlation matrix is

[3.58]

The correlations are all positive. Several exceed 0.90. The one between DKK and EUR exceeds 0.99. The smallest is a respectable 0.45 between GBP and SEK. With such pronounced interdependencies between its components, we expect **Z** to be multicollinear, and it is. The correlation matrix has determinant |**ρ**| = .0000045.
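The determinant test is easy to replicate. A sketch with a small illustrative correlation matrix (not the 7×7 matrix [3.58]) showing the same feature of strong positive correlations:

```python
import numpy as np

# Illustrative 3x3 correlation matrix (not [3.58]) with strong
# positive correlations between all components.
rho = np.array([
    [1.00, 0.99, 0.90],
    [0.99, 1.00, 0.92],
    [0.90, 0.92, 1.00],
])

# A determinant near zero signals multicollinearity: the matrix is
# close to singular, so realizations nearly sit in a lower-dimensional
# plane.  For uncorrelated components the determinant would equal 1.
print(np.linalg.det(rho))
```

The closer the determinant is to zero, the more nearly the components satisfy an exact linear relationship.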

To define principal components of **Z**, we calculate orthonormal⁶ eigenvectors *v_{i}* of the covariance matrix **Σ_{Z}** of **Z**. We arrange these as the columns of a matrix:

[3.59]

The eigenvectors *v_{i}* of covariance matrix [3.57] are graphed in Exhibit 3.11. Corresponding eigenvalues λ_{i} are also indicated.
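One way to obtain such orthonormal eigenvectors is with a symmetric eigendecomposition routine. A sketch, using an illustrative 3×3 covariance matrix in place of [3.57]:

```python
import numpy as np

# Illustrative 3x3 covariance matrix (not the book's [3.57]).
cov = np.array([
    [4.0, 2.4, 1.0],
    [2.4, 9.0, 3.0],
    [1.0, 3.0, 1.5],
])

# eigh is the appropriate routine for symmetric matrices: it returns
# real eigenvalues (ascending) and orthonormal eigenvectors as columns.
eigvals, v = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]        # order by variance, largest first
eigvals, v = eigvals[order], v[:, order]

# The columns of v are orthonormal: v.T @ v is the identity matrix.
print(np.allclose(v.T @ v, np.eye(3)))   # True
```

The eigendecomposition also recovers the covariance matrix itself, since `v @ np.diag(eigvals) @ v.T` reproduces `cov`.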

The eigenvectors may be thought of as “modes of fluctuation” of random vector **Z**. We observed in our historical data a tendency for the European currencies to move together. This is reflected in the first eigenvector. It describes a broad move in all the currencies, with the GBP participating about half as much as the other currencies. The second eigenvector has the GBP moving in opposition to the NOK and SEK, with the CHF moving modestly with the GBP. The third eigenvector describes the GBP, NOK, and SEK moving together in opposition to the other currencies. The remaining eigenvectors describe other such “modes of fluctuation.”

If the eigenvectors *v_{i}* are modes of fluctuation of **Z**, then **Z** is a random combination of those modes of fluctuation:

[3.60]
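A sketch of that random combination, assuming [3.60] takes the standard form **Z** = *D_{1}v_{1}* + ... + *D_{n}v_{n}* with mean zero; the 2×2 covariance matrix and the realization of **D** below are made up for illustration:

```python
import numpy as np

# Illustrative 2x2 covariance matrix; v holds its orthonormal
# eigenvectors (the modes of fluctuation) as columns.
cov = np.array([[4.0, 2.4],
                [2.4, 9.0]])
v = np.linalg.eigh(cov)[1]

d = np.array([0.7, -1.2])            # one hypothetical realization of D
z = v @ d                            # matrix form of the combination

# The matrix-vector product is exactly the weighted sum of the modes:
# each component D_i scales its eigenvector v_i.
print(np.allclose(z, d[0] * v[:, 0] + d[1] * v[:, 1]))   # True
```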

The *D_{i}* are the principal components of **Z**. They are random variables that define each mode of fluctuation’s random contribution to **Z**. The *D_{i}* are uncorrelated, with variances equal to the eigenvalues of their corresponding eigenvectors. The vector **D** of principal components has mean **μ_{D}** = **0** and covariance matrix

[3.61]
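The uncorrelatedness of the *D_{i}* and the link between their variances and the eigenvalues can be checked by simulation. A sketch, again with an illustrative 2×2 covariance matrix rather than [3.57]:

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw correlated samples of Z, form the principal components
# D = V'Z, and check that the sample covariance of D is diagonal
# with the eigenvalues on the diagonal.
cov_z = np.array([[4.0, 2.4],
                  [2.4, 9.0]])
z = rng.multivariate_normal([0.0, 0.0], cov_z, size=200_000).T

eigvals, v = np.linalg.eigh(np.cov(z))
order = np.argsort(eigvals)[::-1]
eigvals, v = eigvals[order], v[:, order]

d = v.T @ z                          # principal components of the sample
cov_d = np.cov(d)

# Off-diagonal entries vanish; diagonal entries equal the eigenvalues.
print(cov_d.round(3))
```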

We have ordered our principal components according to their variances. From our covariance matrix **Σ_{D}**, we see that the first three principal components are more significant than the rest. The last two principal components, *D_{6}* and *D_{7}*, have variances that are less than 1% of the variance of *D_{1}*. Their contribution to random vector **Z** is trivial.

We can approximate **Z** by discarding from [3.60] less significant principal components. The more we discard, the simpler—and cruder!—will be our approximation. For this example, we shall be aggressive and discard the contributions of the last four principal components, approximating **Z** with just the first three. We define

[3.62]

and approximate **Z** with *Z̃*. Like **Z**, *Z̃* has mean vector **0**. Its covariance matrix is obtained from [3.61] and [3.62] using [3.31]:

[3.63]

Comparing this covariance matrix with [3.57], you can judge for yourself the quality of our approximation.
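At the covariance level, this kind of approximation can be sketched generically: keep the *k* largest eigenvalues and their eigenvectors, and rebuild the covariance matrix from them. The 3×3 matrix below is illustrative, not the book's [3.57]:

```python
import numpy as np

# Sketch of covariance-level dimension reduction: keep the k largest
# principal components and rebuild an approximate covariance matrix.
cov = np.array([
    [4.0, 2.4, 1.0],
    [2.4, 9.0, 3.0],
    [1.0, 3.0, 1.5],
])
eigvals, v = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, v = eigvals[order], v[:, order]

k = 2                                # number of components retained
cov_approx = v[:, :k] @ np.diag(eigvals[:k]) @ v[:, :k].T

# The entrywise error is bounded by the sum of discarded eigenvalues,
# so small discarded variances mean a close approximation.
print(np.abs(cov - cov_approx).max())
```

Comparing `cov_approx` with `cov` plays the same role as comparing [3.63] with [3.57]: the discarded eigenvalues control how much of the covariance structure is lost.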