4.3.7 Estimators for Random Vectors
So far, we have considered estimators for random variables. The discussion generalizes to random vectors. Our earlier definitions of estimator and sample estimator carry over without modification. Samples may comprise random vectors, and parameters may be vectors or matrices. For example, we might estimate a random vector's mean vector and covariance matrix, both of which are non-scalar parameters. The notions of bias, standard error, and MSE apply to estimators of scalar parameters, but analogous issues arise with estimators of non-scalar parameters.
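As a concrete sketch of the idea, the code below (Python with NumPy; the distribution and parameter values are illustrative assumptions, not from the text) computes sample estimates of a random vector's mean vector and covariance matrix:

```python
import numpy as np

# Simulate a realization of a sample of m bivariate observations.
rng = np.random.default_rng(0)
m = 500
true_mean = np.array([1.0, -2.0])          # illustrative mean vector
true_cov = np.array([[2.0, 0.6],
                     [0.6, 1.0]])          # illustrative covariance matrix
x = rng.multivariate_normal(true_mean, true_cov, size=m)  # shape (m, 2)

# Sample mean vector: average each component across the sample.
mean_hat = x.mean(axis=0)

# Sample covariance matrix, using the 1/(m - 1) divisor.
dev = x - mean_hat
cov_hat = dev.T @ dev / (m - 1)

print(mean_hat)
print(cov_hat)
```

Both estimates are non-scalar: the first is a vector, the second a symmetric matrix.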
Exercises
What is the difference between a sample and a realization of a sample?
Consider the data of Exhibit 4.3:

Treat the data as a realization {x[1], x[2], … , x[50]} of a sample {X[1], X[2], … , X[50]} for X ~ U(0,θ). We wish to estimate the unknown parameter θ. Consider two estimators:
[4.23]  H1 = (2/m)(X[1] + X[2] + … + X[m])

[4.24]  H2 = max(X[1], X[2], … , X[m])
- Make sure you understand both estimators. Describe in your own words why each is reasonable.
- Using the data, estimate θ based upon each of the estimators. (If you highlight the data in Exhibit 4.3, you can copy and paste it into a spreadsheet or other application.)
- In light of the fact that x[10] = 14.66, did both estimators produce reasonable estimates for the upper bound θ of the interval [0, θ]?
- Are the estimators biased or unbiased?
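Since Exhibit 4.3's data are not reproduced here, the exercise can still be explored with simulated data. The sketch below assumes the two estimators are the method-of-moments estimator (twice the sample mean) and the sample maximum, the usual pair for U(0, θ); the value θ = 15 is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 15.0   # assumed true upper bound (illustrative; not from Exhibit 4.3)
m = 50

# One realization of the sample, analogous to the exercise's 50 data points.
x = rng.uniform(0.0, theta, size=m)

est_mean = 2.0 * x.mean()   # twice the sample mean: E[X] = theta/2, so this targets theta
est_max = x.max()           # sample maximum: never exceeds theta, so it undershoots on average

print(est_mean, est_max)

# Check bias by simulation: average each estimator over many independent samples.
trials = rng.uniform(0.0, theta, size=(20000, m))
avg_mean_est = (2.0 * trials.mean(axis=1)).mean()   # near theta: unbiased
avg_max_est = trials.max(axis=1).mean()             # near theta * m/(m + 1): biased low

print(avg_mean_est, avg_max_est)
```

The simulation also illustrates the pitfall raised in part (c): on a given realization, twice the sample mean can come out below the largest observed value, which is impossible for the true θ.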
In this exercise, you will demonstrate that sample variance estimator [4.5] is biased but that alternative estimator [4.27], discussed in the next section, is unbiased.
- First prove a technical result that will be needed for the derivations. Prove that, given any set of numbers {x[1], x[2], … , x[m]} whose average is x̄ = (1/m)(x[1] + x[2] + … + x[m]):

[4.25]  (x[1] − x̄)² + (x[2] − x̄)² + … + (x[m] − x̄)² = x[1]² + x[2]² + … + x[m]² − m x̄²
- Derive a formula for the bias of estimator [4.5].
- Modify your derivation from (b) to obtain a formula for the bias of estimator [4.27], discussed in the next section.
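The bias you are asked to derive can be previewed numerically. Assuming [4.5] divides the sum of squared deviations about the sample mean by m, and [4.27] divides it by m − 1 (the standard forms), a simulation sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 10
sigma2 = 4.0    # true variance (illustrative assumption)
samples = rng.normal(0.0, np.sqrt(sigma2), size=(100000, m))

# Sum of squared deviations about each sample's own mean.
xbar = samples.mean(axis=1, keepdims=True)
ss = ((samples - xbar) ** 2).sum(axis=1)

biased = ss / m          # estimator [4.5]: divides by m
unbiased = ss / (m - 1)  # estimator [4.27]: divides by m - 1

print(biased.mean())     # near sigma2 * (m - 1)/m, i.e. systematically low
print(unbiased.mean())   # near sigma2
```

The averages suggest the result your derivation should produce: [4.5] has bias −σ²/m, while [4.27] has bias zero.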
Suppose a distribution has known mean μ (perhaps obtained by some symmetry argument) but unknown variance σ². Calculate the bias of the following estimator for σ²:
[4.26]  H = (1/m)((X[1] − μ)² + (X[2] − μ)² + … + (X[m] − μ)²)
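Assuming the estimator averages the m squared deviations about the known mean μ, its behavior can be previewed by simulation (distribution and parameter values below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma2, m = 5.0, 9.0, 20    # known mean, true variance, sample size (illustrative)
samples = rng.normal(mu, np.sqrt(sigma2), size=(100000, m))

# Average squared deviation about the KNOWN mean mu (divide by m, not m - 1).
est = ((samples - mu) ** 2).mean(axis=1)

print(est.mean())   # lands near sigma2
```

Because the deviations are taken about the true mean rather than the sample mean, no degrees-of-freedom correction is needed, which is what the requested bias calculation should confirm.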