3.2 Prerequisites
We assume familiarity with basic notation and concepts from probability. If E is an event, we denote its probability Pr(E). You should be familiar with random variables and random vectors. A random vector X can be thought of as an n-dimensional vector of random variables Xi, all defined on the same sample space. When we present general definitions or results for random vectors, they apply to random variables as well, since a random variable is simply a one-dimensional random vector.
It is important to distinguish between a random vector X and a realization of that random vector, which we may denote x. The realization is an element of the range of the random vector.
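The distinction can be made concrete with a brief sketch. The following Python snippet is an illustration only; NumPy, the dimension, and the choice of distribution are assumptions for the example, not part of the text. It draws a realization x of a three-dimensional random vector X:

```python
import numpy as np

rng = np.random.default_rng(seed=1)  # seed fixed only for reproducibility

# X: a 3-dimensional random vector whose components are all generated from
# the same underlying source of randomness (here, independent standard normals).
def draw_X():
    return rng.standard_normal(3)

x = draw_X()  # a realization of X: an ordinary vector of three numbers
print(x)      # each call to draw_X() yields a different realization
```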
You should be familiar with discrete and continuous distributions for random vectors. You should be comfortable working with probability functions (PFs), probability density functions (PDFs), and cumulative distribution functions (CDFs). You should be familiar with joint distributions, conditional distributions, and marginal distributions.
We may think of random vectors as being “equivalent” in several senses; we distinguish two of these. Random vectors X and Y are equal, denoted X = Y, if they take on the same value with probability 1. If X and Y merely have the same probability distribution, we denote this relationship X ~ Y. We also use the symbol ~ to indicate what a random variable represents, for example: X ~ tomorrow’s 3-month USD Libor rate.
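The difference between equality and equality in distribution is easy to demonstrate numerically. The following Python sketch is a hedged illustration with assumed uniform variables, not an example from the text: if X is uniform on (0, 1) and Y = 1 − X, the two have the same distribution, so X ~ Y, but Pr(X = Y) = 0, so they are not equal.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

x = rng.uniform(0.0, 1.0, size=100_000)  # realizations of X ~ Uniform(0, 1)
y = 1.0 - x                              # Y = 1 - X, built from the same draws

# Same distribution (X ~ Y): empirical quantiles agree closely.
print(np.quantile(x, [0.1, 0.5, 0.9]))
print(np.quantile(y, [0.1, 0.5, 0.9]))

# Not equal (X != Y): the paired realizations essentially never coincide.
print(np.mean(np.isclose(x, y)))  # close to 0; they agree only when X is near 0.5
```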
You should know what it means for two or more components of a random vector X to be independent. In particular, if the n components Xi are independent, their joint CDF F and marginal CDFs Fi satisfy:

F(x1, x2, … , xn) = F1(x1) F2(x2) … Fn(xn)   [3.1]

for all x1, x2, … , xn ∈ ℝ. Similarly, their joint PDF f and marginal PDFs fi satisfy:

f(x1, x2, … , xn) = f1(x1) f2(x2) … fn(xn)   [3.2]

for all x1, x2, … , xn ∈ ℝ.1
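A numerical check of the factorization in [3.2] may help fix ideas. The sketch below is illustrative only; SciPy and the particular normal parameters are assumptions, not part of the text. It compares the joint PDF of two independent normal components with the product of their marginal PDFs:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Two independent components: X1 ~ N(0, 1) and X2 ~ N(1, 2^2); parameters are arbitrary.
x1_dist = norm(loc=0.0, scale=1.0)
x2_dist = norm(loc=1.0, scale=2.0)

# For jointly normal components, independence corresponds to a diagonal covariance matrix.
joint = multivariate_normal(mean=[0.0, 1.0], cov=[[1.0, 0.0], [0.0, 4.0]])

for point in [(0.0, 1.0), (1.5, -0.5), (-2.0, 3.0)]:
    lhs = joint.pdf(point)                           # joint PDF, left side of [3.2]
    rhs = x1_dist.pdf(point[0]) * x2_dist.pdf(point[1])  # product of marginals, right side
    print(point, np.isclose(lhs, rhs))               # prints True at each test point
```

An analogous check of [3.1] could compare the distributions’ CDF methods in the same way.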