Orthogonal Vectors

The standard unit vectors in $\mathbb{R}^n$ are the keys to relating
algebraic notions to geometric ones. Their algebra is
particularly simple because they are mutually perpendicular (and
because each has length one). When dealing with subspaces
of $\mathbb{R}^n$, it is natural to seek bases of mutually
perpendicular vectors with the same computational advantages.

**Definition.**
A nonempty subset of nonzero vectors in $\mathbb{R}^n$
is called an **orthogonal set** if any two distinct vectors in the
set are perpendicular, that is, their dot product is zero.

**Examples**
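As a quick numerical illustration (the vectors below are hypothetical, chosen for this sketch, not taken from the text), we can check that a set in $\mathbb{R}^3$ is orthogonal by verifying that every pairwise dot product is zero:

```python
import numpy as np

# A hypothetical orthogonal set in R^3: every pair of distinct
# vectors has dot product zero.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])

vectors = [v1, v2, v3]
for i in range(len(vectors)):
    for j in range(i + 1, len(vectors)):
        # Each pairwise dot product is 0, so the set is orthogonal.
        print(vectors[i] @ vectors[j])  # prints 0.0 three times
```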

Orthogonal sets are automatically linearly independent.

**Theorem.**
Any orthogonal set of vectors is linearly independent.

To see this result, suppose that
$\mathbf{v}_1, \ldots, \mathbf{v}_k$
are in this orthogonal set, and there are constants
$c_1, \ldots, c_k$ such that
$c_1\mathbf{v}_1 + \cdots + c_k\mathbf{v}_k = \mathbf{0}$.
Taking the dot product of $\mathbf{v}_j$ with both sides kills every
term except the $j$th, leaving
$c_j(\mathbf{v}_j \cdot \mathbf{v}_j) = 0$; since
$\mathbf{v}_j \neq \mathbf{0}$, it follows that $c_j = 0$ for each $j$.
**Definition.**
If $V$ is a subspace of $\mathbb{R}^n$
and the orthogonal set $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$ is a
basis for $V$, it is called an **orthogonal basis** for $V$.

Now, if $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$ is an orthogonal
basis for a subspace $V$, then every vector $\mathbf{v}$ in $V$ can
be written as

$$\mathbf{v} = c_1\mathbf{v}_1 + \cdots + c_k\mathbf{v}_k$$

for some choice of
coefficients. The advantage to orthogonality is that the
coefficients are easy to determine. Again, by taking the dot
product of $\mathbf{v}_j$
with both sides of this equation we obtain
$\mathbf{v} \cdot \mathbf{v}_j = c_j(\mathbf{v}_j \cdot \mathbf{v}_j)$,
so that

$$c_j = \frac{\mathbf{v} \cdot \mathbf{v}_j}{\mathbf{v}_j \cdot \mathbf{v}_j}.$$

**Theorem (Representation of Vectors in terms of Orthogonal Bases).**

Let $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$
be an orthogonal basis for a subspace $V$ of $\mathbb{R}^n$. Then
every vector $\mathbf{v}$ in $V$ satisfies

$$\mathbf{v} = \sum_{j=1}^{k} \frac{\mathbf{v} \cdot \mathbf{v}_j}{\mathbf{v}_j \cdot \mathbf{v}_j}\,\mathbf{v}_j.$$

**Examples**
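The coefficient formula $c_j = (\mathbf{v} \cdot \mathbf{v}_j)/(\mathbf{v}_j \cdot \mathbf{v}_j)$ can be verified numerically. In this sketch (the basis vectors are hypothetical choices, not from the text), we build a vector from known coefficients and recover them with dot products:

```python
import numpy as np

# Hypothetical orthogonal basis of a plane in R^3.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])

# A vector in their span, built with known coefficients 3 and 2.
v = 3.0 * v1 + 2.0 * v2          # = [5, 1, 0]

# Coefficients recovered via c_j = (v . v_j) / (v_j . v_j).
c1 = (v @ v1) / (v1 @ v1)
c2 = (v @ v2) / (v2 @ v2)
print(c1, c2)  # 3.0 2.0
```

No linear system needs to be solved; each coefficient comes from a single dot-product quotient, which is exactly the computational advantage the text describes.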

Orthogonal bases are
computationally easier to deal with than general bases. Is it
true that every subspace of
$\mathbb{R}^n$
has an orthogonal basis?
The answer is yes, and the reason is that, starting with any basis, we
can construct an orthogonal one via the following algorithm. The
algorithm is derived by following the same procedure we used to find
the decomposition of a vector into one parallel to a given line and
one perpendicular to that line; now we decompose successive
vectors into ones in the span of an already constructed subspace and
one perpendicular to that subspace.
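The line decomposition the algorithm builds on can be sketched numerically. Here (with hypothetical example vectors) we split $\mathbf{v}$ into a component parallel to $\mathbf{u}$ and a remainder perpendicular to it:

```python
import numpy as np

# Decompose v into a piece parallel to u and a piece perpendicular to u.
u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])

# Projection of v onto the line span{u}.
parallel = ((v @ u) / (u @ u)) * u
# The remainder is orthogonal to u by construction.
perpendicular = v - parallel

print(perpendicular @ u)  # 0.0
```

Gram-Schmidt repeats exactly this step, subtracting the projection onto each previously constructed vector in turn.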

**Theorem (Gram-Schmidt Algorithm).**
Let $\{\mathbf{u}_1, \ldots, \mathbf{u}_k\}$
be a linearly independent subset of $\mathbb{R}^n$. Set
$\mathbf{v}_1 = \mathbf{u}_1$, and for $j = 2, \ldots, k$ set

$$\mathbf{v}_j = \mathbf{u}_j - \sum_{i=1}^{j-1} \frac{\mathbf{u}_j \cdot \mathbf{v}_i}{\mathbf{v}_i \cdot \mathbf{v}_i}\,\mathbf{v}_i.$$

Then $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$ is an orthogonal basis
for the span of $\{\mathbf{u}_1, \ldots, \mathbf{u}_k\}$.
Indeed, the set
$\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$
has the same span as
$\{\mathbf{u}_1, \ldots, \mathbf{u}_k\}$, since each $\mathbf{v}_j$
differs from $\mathbf{u}_j$ by a linear combination of the earlier
vectors. Moreover, if $\mathbf{v}_1, \ldots, \mathbf{v}_{j-1}$ are
already mutually perpendicular, then for each $i < j$

$$\mathbf{v}_i \cdot \mathbf{v}_j = \mathbf{v}_i \cdot \mathbf{u}_j - \frac{\mathbf{u}_j \cdot \mathbf{v}_i}{\mathbf{v}_i \cdot \mathbf{v}_i}\,(\mathbf{v}_i \cdot \mathbf{v}_i) = 0.$$

By induction, then, the set $\{\mathbf{v}_1, \ldots, \mathbf{v}_k\}$
is orthogonal.

**Example**
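A minimal sketch of the algorithm, assuming the standard projection formula above (the input vectors are hypothetical choices for illustration):

```python
import numpy as np

def gram_schmidt(us):
    """Turn linearly independent vectors into an orthogonal set
    by subtracting, from each u_j, its projections onto the
    previously constructed v_i."""
    vs = []
    for u in us:
        v = u - sum(((u @ vi) / (vi @ vi)) * vi for vi in vs)
        vs.append(v)
    return vs

us = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
v1, v2 = gram_schmidt(us)
print(v1, v2, v1 @ v2)  # v1 @ v2 is 0.0
```

Note that the algorithm never normalizes; it produces an orthogonal (not orthonormal) set, matching the theorem's statement.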

It is convenient to deal with orthogonal vectors each of which has norm 1.
Such vectors are called **unit vectors**. Notice that any orthogonal set
can be transformed into one in which each vector has norm 1, by simply
dividing each vector by its norm; the orthogonality is unaffected, and
the new norms are all 1. An orthogonal set of unit vectors is
called an **orthonormal set**, and an orthonormal set that is a basis
is an **orthonormal basis**. The Gram-Schmidt
procedure and the earlier representation theorem yield
the following result.

**Theorem**

Every subspace $W$
of $\mathbb{R}^n$
has an orthonormal basis.
If $\{\mathbf{w}_1, \ldots, \mathbf{w}_k\}$ is an orthonormal basis
for $W$, then every vector $\mathbf{v}$ in $W$ satisfies

$$\mathbf{v} = \sum_{j=1}^{k} (\mathbf{v} \cdot \mathbf{w}_j)\,\mathbf{w}_j.$$
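The normalization step, and the simplification it buys in the representation formula (each denominator $\mathbf{w}_j \cdot \mathbf{w}_j$ becomes 1), can be checked numerically; the vectors below are hypothetical choices for this sketch:

```python
import numpy as np

# Normalize a hypothetical orthogonal basis to get an orthonormal one.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
w1 = v1 / np.linalg.norm(v1)
w2 = v2 / np.linalg.norm(v2)

# With an orthonormal basis, the coefficients are plain dot products.
v = np.array([5.0, 1.0, 0.0])            # lies in span{w1, w2}
reconstructed = (v @ w1) * w1 + (v @ w2) * w2
print(np.allclose(reconstructed, v))  # True
```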