In statistics and signal processing, the orthogonality principle is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, the orthogonality principle says that the error vector of the optimal estimator (in a mean square error sense) is orthogonal to any possible estimator. The orthogonality principle is most commonly stated for linear estimators, but more general formulations are possible. Since the principle is a necessary and sufficient condition for optimality, it can be used to find the minimum mean square error estimator.
Orthogonality principle for linear estimators
The orthogonality principle is most commonly used in the setting of linear estimation. In this context, let x be an unknown random vector which is to be estimated based on the observation vector y. One wishes to construct a linear estimator

    \hat{x} = Hy + c

for some matrix H and vector c. Then, the orthogonality principle states that the estimator \hat{x} achieves minimum mean square error if and only if

    E\{ (\hat{x} - x) y^T \} = 0 \quad \text{and} \quad E\{ \hat{x} - x \} = 0.

If x and y have zero mean, then it suffices to require the first condition.
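Both conditions are easy to check numerically. The following Python sketch is illustrative only and not part of the original text: the scalar model y = x + w, the sample size, and all variable names are our own choices. The coefficient h and offset c are estimated from sample moments, and the two expectations above are then approximated by sample averages.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = 2.0 + 2.0 * rng.standard_normal(n)   # x with mean 2 and variance 4
y = x + rng.standard_normal(n)           # noisy observation of x

C = np.cov(x, y)                         # 2x2 sample covariance matrix
h = C[0, 1] / C[1, 1]                    # h = Cov(x, y) / Var(y)
c = x.mean() - h * y.mean()              # c = E[x] - h E[y]
x_hat = h * y + c

err = x_hat - x
print(np.mean(err * y))   # first condition:  E{(x_hat - x) y} ~ 0
print(np.mean(err))       # second condition: E{x_hat - x} ~ 0
```

Both printed values shrink toward zero as the sample size grows, consistent with the two conditions.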
Example
Suppose x is a Gaussian random variable with mean m and variance \sigma_x^2. Also suppose we observe a value y = x + w, where w is Gaussian noise which is independent of x and has mean 0 and variance \sigma_w^2. We wish to find a linear estimator \hat{x} = hy + c minimizing the MSE. Substituting the expression \hat{x} = hy + c into the two requirements of the orthogonality principle, we obtain

    0 = E\{ (\hat{x} - x) y \} = E\{ (hy + c - x)(x + w) \} = h(\sigma_x^2 + m^2 + \sigma_w^2) + cm - (\sigma_x^2 + m^2)

and

    0 = E\{ \hat{x} - x \} = E\{ hy + c - x \} = hm + c - m.

Solving these two linear equations for h and c results in

    h = \frac{\sigma_x^2}{\sigma_x^2 + \sigma_w^2}, \qquad c = \frac{\sigma_w^2}{\sigma_x^2 + \sigma_w^2} m,

so that the linear minimum mean square error estimator is given by

    \hat{x} = \frac{\sigma_x^2}{\sigma_x^2 + \sigma_w^2} y + \frac{\sigma_w^2}{\sigma_x^2 + \sigma_w^2} m.
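As a sanity check (ours, not part of the original text), one can verify numerically that these weights indeed minimize the MSE. With arbitrary illustrative parameters m = 5, \sigma_x^2 = 4, \sigma_w^2 = 1, the closed-form estimator should attain MSE \sigma_x^2 \sigma_w^2 / (\sigma_x^2 + \sigma_w^2) = 0.8, and any perturbation of h or c should increase it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
m, var_x, var_w = 5.0, 4.0, 1.0          # illustrative parameter choices

x = m + np.sqrt(var_x) * rng.standard_normal(n)
y = x + np.sqrt(var_w) * rng.standard_normal(n)

h = var_x / (var_x + var_w)              # closed-form weight, 0.8
c = m * var_w / (var_x + var_w)          # closed-form offset, 1.0

def mse(h_, c_):
    """Sample mean square error of the linear estimator h_*y + c_."""
    return np.mean((h_ * y + c_ - x) ** 2)

print(mse(h, c))                          # ~ 0.8, the minimum MSE
print(mse(h + 0.05, c), mse(h, c + 0.5))  # perturbations increase the MSE
```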
This estimator can be interpreted as a weighted average between the noisy measurements y and the prior expected value m. If the noise variance \sigma_w^2 is low compared with the variance of the prior \sigma_x^2 (corresponding to a high SNR), then most of the weight is given to the measurements y, which are deemed more reliable than the prior information. Conversely, if the noise variance is relatively higher, then the estimate will be close to m, as the measurements are not reliable enough to outweigh the prior information.
Finally, note that because the variables x and y are jointly Gaussian, the minimum MSE estimator is linear. Therefore, in this case, the estimator above minimizes the MSE among all estimators, not only linear estimators.
General formulation
Let V be a Hilbert space of random variables with the inner product defined by

    \langle x, y \rangle = E\{ x y^H \}.

Suppose W is a closed subspace of V, representing the space of all possible estimators. One wishes to find a vector \hat{x} \in W which approximates a vector x \in V; more accurately, one would like to minimize the mean squared error (MSE) E\| x - \hat{x} \|^2 between \hat{x} and x.
In the special case of linear estimators described above, the space V is the set of all functions of x and y, while W is the set of linear estimators, i.e., linear functions of y only. Other settings which can be formulated in this way include the subspace of causal linear filters and the subspace of all (possibly nonlinear) estimators.
Geometrically, we can see this problem in the simple case where W is a one-dimensional subspace: we want to find the closest approximation to the vector x by a vector \hat{x} in the space W. From the geometric interpretation, it is intuitive that the best approximation, or smallest error, occurs when the error vector, e, is orthogonal to vectors in the space W.
More accurately, the general orthogonality principle states the following: given a closed subspace W of estimators within a Hilbert space V and an element x in V, an element \hat{x} \in W achieves minimum MSE among all elements in W if and only if

    \langle x - \hat{x}, y \rangle = 0 \quad \text{for all } y \in W.
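A finite-dimensional illustration may help make the Hilbert-space statement concrete. In the sketch below, which is ours rather than the article's, the subspace W is spanned by the columns of a hypothetical matrix P, the inner product is the ordinary Euclidean dot product, and the projection is computed by least squares.

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.standard_normal((6, 3))    # columns of P span the subspace W
x = rng.standard_normal(6)         # element of the ambient space V

# The projection of x onto W is the least-squares fit over W
coeffs, *_ = np.linalg.lstsq(P, x, rcond=None)
x_hat = P @ coeffs

err = x - x_hat
print(P.T @ err)    # <x - x_hat, p_j> ~ 0 for every basis vector p_j of W
```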
Stated in such a manner, this principle is simply a statement of the Hilbert projection theorem. Nevertheless, the extensive use of this result in signal processing has resulted in the name "orthogonality principle."
A solution to error minimization problems
The following is one way to find the minimum mean square error estimator by using the orthogonality principle.
We want to be able to approximate a vector x by

    \hat{x} = \sum_i c_i p_i,

where \hat{x} is the approximation of x as a linear combination of vectors in the subspace W spanned by p_1, p_2, \ldots. Therefore, we want to be able to solve for the coefficients c_i, so that we may write our approximation in known terms.
By the orthogonality theorem, the square norm of the error vector, \| e \|^2, is minimized when, for all j,

    \left\langle x - \sum_i c_i p_i, \; p_j \right\rangle = 0.
Developing this equation, we obtain

    \langle x, p_j \rangle = \left\langle \sum_i c_i p_i, \; p_j \right\rangle = \sum_i c_i \langle p_i, p_j \rangle.
If there is a finite number n of vectors p_i, one can write this equation in matrix form as

    \begin{bmatrix} \langle x, p_1 \rangle \\ \langle x, p_2 \rangle \\ \vdots \\ \langle x, p_n \rangle \end{bmatrix} = \begin{bmatrix} \langle p_1, p_1 \rangle & \langle p_2, p_1 \rangle & \cdots & \langle p_n, p_1 \rangle \\ \langle p_1, p_2 \rangle & \langle p_2, p_2 \rangle & \cdots & \langle p_n, p_2 \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle p_1, p_n \rangle & \langle p_2, p_n \rangle & \cdots & \langle p_n, p_n \rangle \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}.
Assuming the p_i are linearly independent, the Gramian matrix can be inverted to obtain

    \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix} = \begin{bmatrix} \langle p_1, p_1 \rangle & \cdots & \langle p_n, p_1 \rangle \\ \vdots & \ddots & \vdots \\ \langle p_1, p_n \rangle & \cdots & \langle p_n, p_n \rangle \end{bmatrix}^{-1} \begin{bmatrix} \langle x, p_1 \rangle \\ \vdots \\ \langle x, p_n \rangle \end{bmatrix},

thus providing an expression for the coefficients c_i of the minimum mean square error estimator.
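Assuming the ordinary Euclidean dot product as the inner product, this recipe translates directly into code: build the Gram matrix and the vector of inner products with x, then solve the linear system. The following sketch is illustrative; the array names and dimensions are our own choices.

```python
import numpy as np

rng = np.random.default_rng(3)
p = rng.standard_normal((4, 8))   # rows p_1..p_4 span the subspace W
x = rng.standard_normal(8)        # vector to be approximated

G = p @ p.T                       # Gram matrix: G[j, i] = <p_i, p_j>
b = p @ x                         # right-hand side: b[j] = <x, p_j>
c = np.linalg.solve(G, b)         # coefficients of the approximation

x_hat = c @ p                     # approximation of x as sum_i c_i p_i
print(p @ (x - x_hat))            # orthogonality: <x - x_hat, p_j> ~ 0
```

Solving the system directly (rather than inverting the Gram matrix) is the numerically preferable way to carry out the same computation.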