Lecture 24
Orthogonal Complements

We want to generalize our procedure of decomposing a vector into a component parallel to a given line and a component perpendicular to that line, replacing the line (really the direction determined by a single vector) with an arbitrary nonempty collection of vectors.

Definition. The orthogonal complement of a nonempty subset S of R^n is the set of all vectors in R^n that are orthogonal to every vector in S:

S⊥ = {v in R^n : v·u = 0 for every u in S}.

Notice that the orthogonal complement of S is automatically a subspace of R^n, since it is easily seen to be closed under addition and scalar multiplication, and to contain 0.

Notice that the orthogonal complement of a subspace of R^n is the same as the orthogonal complement of a basis for that subspace: a vector orthogonal to each basis vector is orthogonal to every linear combination of them, and hence to the whole subspace.

Example
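For instance, take S = {(1, 0, 0)}, a single vector in R^3. A vector v = (x, y, z) satisfies v·(1, 0, 0) = x = 0 exactly when its first coordinate is 0, so S⊥ is the yz-plane. Even though S itself is not a subspace, S⊥ is one, as promised.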

Notice that we can view the orthogonal complement of a finite set S of vectors as the solution set of a homogeneous linear system. The corresponding matrix is the one whose rows are the vectors in S, and the orthogonal complement is then exactly its null space.

For any matrix A, (Row A)⊥ = Null A.
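As a numerical illustration, here is a minimal NumPy sketch, with an arbitrarily chosen matrix A: a basis for Null A is read off from the singular value decomposition and checked to be orthogonal to every row of A.

    import numpy as np

    # Rows of A are the vectors of S.
    A = np.array([[1.0, 2.0, 1.0],
                  [0.0, 1.0, 1.0]])

    # Right singular vectors with (numerically) zero singular value
    # span Null A = (Row A)⊥.
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    null_basis = Vt[rank:].T          # columns form a basis of Null A

    # Every row of A is orthogonal to every null-space basis vector.
    print(np.allclose(A @ null_basis, 0.0))   # True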

In particular, if the rows of A are a basis for a subspace W of R^n which has dimension k, then the rank of A is k and the nullity of A is n - k. This means that the dimension of the row space and the dimension of the space perpendicular to it automatically add up to the total dimension n.

For any subspace W of R^n, dim W + dim W⊥ = n.

Example
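For instance, let W be the line in R^3 spanned by (1, 1, 0), so dim W = 1. Its orthogonal complement W⊥ is the plane of all (x, y, z) with x + y = 0, which has dimension 2, and indeed 1 + 2 = 3.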

Now we can consider splitting vectors in R^n into a piece in a given subspace and a piece in the orthogonal complement of that subspace.

Orthogonal Decomposition Theorem. Let W be a subspace of R^n. Then for any vector v in R^n, there exist a unique vector w in W and a unique vector z in W⊥ such that v = w + z. The vector w is called the orthogonal projection of v onto W.

Indeed, let {w1, . . . , wk} be an orthonormal basis for W; we want to find the appropriate coefficients c1, . . . , ck in w = c1w1 + · · · + ckwk. If we set z = (v - w), then the requirement that z be perpendicular to W, that is, (v - w)·wj = 0 for each j with 1 ≤ j ≤ k, determines those coefficients. These equations are just (v - (c1w1 + · · · + ckwk))·wj = 0, which by orthogonality reduce to v·wj - cj wj·wj = 0. Since each wj is a unit vector, this means that each cj = v·wj, and the vectors w and z = (v - w) are uniquely determined.
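This computation translates directly into a minimal NumPy sketch. Here the subspace W is spanned by the columns of an arbitrarily chosen matrix B, and a QR factorization supplies the orthonormal basis {w1, . . . , wk}:

    import numpy as np

    # Columns of B span W; B is an illustrative choice.
    B = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])

    # QR factorization: the columns of Q are an orthonormal basis for W.
    Q, _ = np.linalg.qr(B)

    v = np.array([3.0, 1.0, 2.0])

    # Coefficients cj = v . wj, so w = sum_j (v . wj) wj.
    w = Q @ (Q.T @ v)
    z = v - w

    print(np.allclose(Q.T @ z, 0.0))   # z is perpendicular to W: True
    print(np.allclose(w + z, v))       # v = w + z: True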

In fact, the vector w is the closest to the original vector v, among all vectors in W.

Closest Vector Property
Let W be a subspace of R^n, and let v be a vector in R^n. Then the vector closest to v, among all the vectors in W, is the orthogonal projection of v onto W.

Indeed, let w denote the orthogonal projection of v onto W. We have, for any w' in W,

v - w' = (v - w) + (w - w').

Now, (v - w) = z is orthogonal to W, and (w - w') is in W, so it is orthogonal to z. By the Pythagorean Theorem, then, ||v - w'||^2 = ||v - w||^2 + ||w - w'||^2 ≥ ||v - w||^2, with equality only when w' = w, and the result follows.
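As a numerical sanity check, here is a minimal NumPy sketch with an arbitrarily chosen W and v: it projects v onto W and confirms that randomly sampled vectors of W never come closer to v than the projection does.

    import numpy as np

    # Columns of B span W; v is the vector to approximate (illustrative).
    B = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])
    v = np.array([3.0, 1.0, 2.0])

    # Orthogonal projection of v onto W, as in the decomposition theorem.
    Q, _ = np.linalg.qr(B)
    w = Q @ (Q.T @ v)

    # Randomly sampled vectors w' in W are never closer to v than w is.
    rng = np.random.default_rng(0)
    dists = [np.linalg.norm(v - B @ rng.normal(size=2)) for _ in range(1000)]
    print(min(dists) >= np.linalg.norm(v - w))   # True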
