We want to generalize our procedure of decomposing a vector into a component parallel to a given line and a component perpendicular to that line, by replacing the line (really, a direction determined by a single vector) with an arbitrary nonempty collection of vectors.
Def. The orthogonal complement of a nonempty subset S of R^n is the set of all vectors in R^n that are orthogonal to every vector in S:
S^⊥ = {v in R^n : v · u = 0 for every u in S}.
Notice that the orthogonal complement of S is automatically a subspace of R^n, since it is easily seen to be closed under addition and scalar multiplication and to contain 0.
Notice that the orthogonal complement of a subspace of R^n is the same as the orthogonal complement of a basis for that subspace: a vector orthogonal to every basis vector is automatically orthogonal to every linear combination of them.
Example
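For instance, take S = {(1, 0, 0), (0, 1, 0)} in R^3. A vector v = (v_1, v_2, v_3) satisfies v · (1, 0, 0) = 0 and v · (0, 1, 0) = 0 exactly when v_1 = v_2 = 0, so S^⊥ is the z-axis {(0, 0, t)}. Note that S itself is not a subspace, but S^⊥ is.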
Notice that we can view the orthogonal complement of a finite set S of vectors as the solution set of a homogeneous linear system: the corresponding matrix is the one whose rows are the vectors of S, and the orthogonal complement of S is then precisely the null space of that matrix.
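As a quick numerical illustration (a minimal sketch in NumPy/SciPy; the matrix A here is an arbitrary choice, not one from the text), we can compute an orthonormal basis of the null space and check that every row of A is orthogonal to it:

```python
import numpy as np
from scipy.linalg import null_space

# The rows of A play the role of the vectors in S.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 2.0]])

N = null_space(A)   # columns form an orthonormal basis of Null A
print(N.shape[1])   # dimension of the orthogonal complement: 4 - 2 = 2
print(A @ N)        # ~ 0 in every entry: each row of A is orthogonal
                    # to each basis vector of Null A
```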
For any matrix A, (Row A)^⊥ = Null A.
In particular, if the rows of A are a basis for a subspace W of R^n which has dimension k, then the rank of A is k and the nullity of A is n - k. This means that the dimension of the row space and the dimension of the space perpendicular to it automatically add up to the total dimension n.
For any subspace W of R^n, dim W + dim W^⊥ = n.
Example
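For instance, if W is the plane in R^3 spanned by (1, 0, 0) and (0, 1, 0), then dim W = 2, while W^⊥ is the z-axis, of dimension 1, and indeed 2 + 1 = 3.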
Now we can consider splitting vectors in R^n into pieces in a given subspace and pieces in the orthogonal complement of that subspace.
Orthogonal Decomposition Theorem. Let W be a subspace of R^n. Then for any vector v in R^n, there exist a unique vector w in W and a unique vector z in W^⊥ such that v = w + z. The vector w is called the orthogonal projection of v onto W.
Indeed, let {w_1, . . . , w_k} be an orthonormal basis for W; we want to find the appropriate coefficients c_j so that w = c_1 w_1 + . . . + c_k w_k. If we set z = v - w, then the requirement that z be perpendicular to W, that is, (v - w) · w_j = 0 for each j with 1 ≤ j ≤ k, will determine those coefficients. These equations are just (v - (c_1 w_1 + . . . + c_k w_k)) · w_j = 0, which by orthogonality reduce to v · w_j - c_j (w_j · w_j) = 0. Since each w_j is a unit vector, this means that each c_j = v · w_j, and the vectors w and z = v - w are uniquely determined.
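A minimal computational sketch of this formula (in NumPy, assuming the columns of a matrix Q hold the orthonormal basis {w_1, . . . , w_k}; the basis and vector below are illustrative choices, not from the text):

```python
import numpy as np

def orthogonal_projection(v, Q):
    """Project v onto W, where the columns of Q are an orthonormal basis of W."""
    c = Q.T @ v   # coefficients c_j = v . w_j
    return Q @ c  # w = c_1 w_1 + ... + c_k w_k

# Illustration: W is the xy-plane in R^3.
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
v = np.array([3.0, 4.0, 5.0])
w = orthogonal_projection(v, Q)   # -> [3, 4, 0], lies in W
z = v - w                         # -> [0, 0, 5], lies in W^perp
print(w, z, Q.T @ z)              # Q.T @ z ~ 0: z is orthogonal to W
```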
In fact, the vector w is the closest to the original vector v among all vectors in W.
Closest Vector Property. Let W be a subspace of R^n and let v be a vector in R^n. Then the vector closest to v among all the vectors in W is the orthogonal projection of v onto W.
Indeed, let w denote the orthogonal projection of v onto W. We have, for any w' in W,
v - w' = (v - w) + (w - w').
Now, (v - w) = z is orthogonal to W, and (w - w') is in W, so (w - w') is orthogonal to z. By the Pythagorean Theorem, then, ||v - w'||^2 = ||v - w||^2 + ||w - w'||^2 ≥ ||v - w||^2, and the result follows.
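A quick numerical sanity check of this property (a sketch with random trials in NumPy, reusing the illustrative basis Q and vector v from the projection sketch above; it is evidence, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])          # orthonormal basis of W (the xy-plane)
v = np.array([3.0, 4.0, 5.0])
w = Q @ (Q.T @ v)                   # orthogonal projection of v onto W

for _ in range(1000):
    w_prime = Q @ rng.normal(size=2)      # a random vector w' in W
    assert np.linalg.norm(v - w_prime) >= np.linalg.norm(v - w)
print("no trial found a vector in W closer to v than its projection")
```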