ALGORITHMS
- Systems:
  - Consistency
  - General solution
- Vector spaces:
  - Basis for:
    - row space
    - column space
    - orthogonal complement
    - span of a set of vectors
    - eigenspace
  - Equations defining a subspace (given a spanning set)
  - Basis for the intersection of two subspaces
    (given spanning sets for each)
  - Orthonormal basis (given a basis)
- Matrices:
  - rank and nullity (see the code sketch after this list)
  - LU decomposition
  - inverse of a matrix
  - determinant
  - eigenvalues
  - characteristic polynomial
  - diagonalization
  - powers
  - orthogonal projection
  - orthogonal diagonalization
  - spectral decomposition
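The following is a minimal sketch of several of these computations in
Python, using sympy for exact row reduction and numpy for numerics. The
matrix is made up purely for illustration.

    import numpy as np
    from sympy import Matrix

    A = Matrix([[1, 2, 1],
                [2, 4, 0],
                [3, 6, 1]])            # made-up example matrix

    # Reduced row echelon form; the pivot columns of A form a basis
    # for the column space.
    R, pivots = A.rref()
    col_basis = [A.col(j) for j in pivots]
    row_basis = A.rowspace()           # nonzero rows of an echelon form
    null_basis = A.nullspace()         # from the general solution of Ax = 0
    rank, nullity = len(pivots), A.cols - len(pivots)

    # Numeric eigenvalues and the characteristic polynomial.
    An = np.array(A.tolist(), dtype=float)
    eigvals, eigvecs = np.linalg.eig(An)   # columns of eigvecs are eigenvectors
    charpoly = np.poly(An)                 # coefficients of det(xI - A)

    # Orthonormal basis by QR (Gram-Schmidt), assuming independent columns.
    B = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    Q, _ = np.linalg.qr(B)                 # columns of Q are orthonormal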
APPLICATIONS
- Polynomial interpolation
- Markov processes
- Cramer's rule
- Differential equations
- Moore graphs
- Least squares (interpolation and least squares are sketched in code
  after this list)
- Conic sections
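Two of these applications admit short numeric sketches; the data points
below are invented for illustration.

    import numpy as np

    # Polynomial interpolation: the unique quadratic through three points
    # with distinct x-coordinates solves a Vandermonde system.
    xs = np.array([1.0, 2.0, 3.0])
    ys = np.array([2.0, 3.0, 6.0])
    coeffs = np.linalg.solve(np.vander(xs, increasing=True), ys)

    # Least squares: fit a line y = c0 + c1*x to non-collinear data via
    # the normal equations A^TA c = A^Ty (A has independent columns).
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([1.1, 1.9, 3.2, 3.8])
    A = np.column_stack([np.ones_like(x), x])
    c = np.linalg.solve(A.T @ A, A.T @ y)

    # The same answer from the library routine.
    c_check, *_ = np.linalg.lstsq(A, y, rcond=None)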
CONCEPTS
- Matrices:
  - matrix
  - symmetric matrix
  - transpose
  - rank
  - row echelon form
  - reduced row echelon form
  - matrix-vector product
  - linear transformation
  - identity matrix
  - rotation
  - reflection
  - matrix product
  - inverse
  - invertibility
  - elementary matrix
  - determinant
  - characteristic polynomial
  - eigenvalue
  - eigenvector
  - diagonalization
  - similar matrices
  - orthogonal matrix
- Vectors:
  - vector
  - linear combination
  - span
  - subspace
  - linear independence
  - basis
  - dimension
  - norm
  - inner product
  - orthogonality
  - orthogonal projection
  - orthogonal complement (several of these vector concepts are
    illustrated in code after this list)
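A small numpy illustration of several of the vector concepts (the
vectors are made up):

    import numpy as np

    u = np.array([1.0, 2.0, 2.0])
    v = np.array([2.0, 0.0, 1.0])

    lin_comb = 3*u - 2*v                   # a linear combination of u and v
    norm_u = np.linalg.norm(u)             # ||u|| = sqrt(u . u) = 3
    inner = u @ v                          # inner (dot) product = 4
    # Linear independence: stack as columns and compare rank to the count.
    independent = np.linalg.matrix_rank(np.column_stack([u, v])) == 2
    # Orthogonal projection of v onto the line spanned by u.
    proj = ((u @ v) / (u @ u)) * u
    # The residual v - proj lies in the orthogonal complement of span{u}.
    assert np.isclose((v - proj) @ u, 0.0)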
THEOREMS
- A linear system has either 0, 1, or infinitely many solutions.
- A system of linear equations is solvable if and only if the data column of
the augmented matrix lies in the span of the coefficient columns.
- Row rank equals column rank; a matrix
and its transpose have the same rank.
- Any two matrices related by a series of elementary row operations
have the same row space.
- Any two matrices related by a series of elementary row operations
have the same null space.
- Any linear relation among the columns of the reduced row echelon form
of a matrix is valid for the columns of the original matrix.
- The columns of a matrix that contain the pivot positions of its
reduced row echelon form give a basis for the column space.
- The nonzero rows of any row echelon form of a matrix give a basis
for its row space.
- The "general solution" of a homogeneous system gives a basis for
the null space of the coefficient matrix.
- The reduced row echelon form of a matrix is unique.
- Matrix-vector multiplication preserves the vector operations.
- Elementary row operations are performed by multiplication by
elementary matrices on the left; multiplication on the right performs
the corresponding elementary column operations.
- Every invertible matrix is a product of elementary matrices.
- A product of invertible matrices is invertible.
- The transpose of an invertible matrix is invertible.
- Determinant test for invertibility: a square matrix is invertible if
and only if its determinant is nonzero.
- The determinant of a product is the product of the determinants.
- Effect of elementary row operations on determinants: swapping two
rows negates the determinant, multiplying a row by c multiplies it by
c, and adding a multiple of one row to another leaves it unchanged.
- The eigenvalues are the roots of the characteristic polynomial.
- The geometric multiplicity of an eigenvalue is no greater than the
algebraic multiplicity.
- Similar matrices have the same determinant, the same characteristic
polynomial, and the same eigenvalues - with the same multiplicities
(both algebraic and geometric).
- If two matrices are similar to a third, they are similar to each other.
- The span of a set of vectors gives a subspace of R^n.
- The orthogonal complement of a set of vectors gives a subspace of
R^n.
- The row space and null space of an m x n matrix are subspaces of
R^n; the column space is a subspace of R^m.
- The rank of a matrix is equal to the dimension of its row space as
well as its column space.
- The nullity of a matrix is equal to the dimension of its null space.
- The rank plus the nullity of a matrix equals its number of columns.
- If W is a subspace of R^n of dimension d, then the dimension of its
orthogonal complement is n-d.
- The column space of a matrix A is the image of the function
f(x)=Ax.
- Any two bases for a subspace have the same number of elements.
- A linearly independent set never has more elements than a spanning
set.
- A homogeneous system with more variables than equations has a
nontrivial solution.
- Any set of more than n vectors in R^n is linearly dependent.
- If a set of vectors is linearly dependent, then one of them
is in the span of the others.
- Nonzero, pairwise orthogonal vectors are linearly independent.
- Eigenvectors corresponding to distinct eigenvalues of a matrix are
linearly independent; and if the matrix is symmetric, they are
orthogonal.
- The transpose of an orthogonal matrix is orthogonal.
- A product of two orthogonal matrices is orthogonal.
- A matrix is orthogonal if and only if multiplication by that matrix
preserves lengths of vectors; in this case, angles are also preserved.
- For any matrix A, the matrices A^TA and A have the same null space.
- If the columns of a matrix A are linearly independent, then A^TA
is invertible.
- Au . v = u . A^Tv for all u and v (and A^T is the only matrix
related to A in this way).
- Orthogonal projection matrices are symmetric, and satisfy P^2=P.
- If a symmetric matrix P satisfies P^2=P, then P is the orthogonal
projection matrix onto its own column space, and I-P is the orthogonal
projection matrix onto its null space.
- If A is a matrix with linearly independent columns, then
the orthogonal projection onto its column space is given by
the matrix A(A^TA)^-1 A^T (this and several nearby results are
checked numerically in the sketch after this list).
- If an nxn matrix has n distinct real eigenvalues, then it is
diagonalizable.
- An nxn matrix is diagonalizable if and only if its eigenvalues are
real, and each one has algebraic multiplicity equal to its geometric
multiplicity.
- Cauchy-Schwarz inequality: |u . v| <= ||u|| ||v||.
- Triangle inequality: ||u + v|| <= ||u|| + ||v||.
- For any vector u and any subspace W, the closest vector in W to u is
the orthogonal projection of u to W, and the distance from u to W is
the length of their difference.
- For any subspace W and any vector u, there is a unique decomposition
of u as the sum of two vectors, one in W and one orthogonal to W.
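Several of the results above can be spot-checked numerically. A minimal
sketch with numpy, using a random matrix so that the hypothesis of
linearly independent columns holds with probability 1:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 3))        # random 5x3 matrix

    # A matrix and its transpose have the same rank.
    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)

    # Au . v = u . A^Tv for all u and v.
    u, v = rng.standard_normal(3), rng.standard_normal(5)
    assert np.isclose((A @ u) @ v, u @ (A.T @ v))

    # P = A(A^TA)^-1 A^T projects onto the column space:
    # it is symmetric and satisfies P^2 = P.
    P = A @ np.linalg.inv(A.T @ A) @ A.T
    assert np.allclose(P, P.T) and np.allclose(P @ P, P)

    # Unique decomposition: v = Pv + (I - P)v, with (I - P)v
    # orthogonal to every column of A.
    assert np.allclose(A.T @ (v - P @ v), 0.0)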
MORE ADVANCED RESULTS
- All of the eigenvalues of a symmetric matrix are real, and the matrix
is diagonalizable by an orthogonal matrix (illustrated in the sketch at
the end of this outline).
- Any nxn matrix has n (complex) eigenvalues, counting multiplicities.
- Any function which preserves the vector operations (scalar
multiplication and vector addition) is given by matrix multiplication;
that matrix is unique.
- Two matrices are similar if and only if they represent the same
linear transformation with respect to (possibly) different bases.
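A short numpy illustration of the spectral theorem for symmetric
matrices; the matrix is random and symmetrized by construction:

    import numpy as np

    rng = np.random.default_rng(1)
    M = rng.standard_normal((4, 4))
    S = (M + M.T) / 2                      # S is symmetric by construction

    # eigh is specialized to symmetric matrices: the eigenvalues come out
    # real and the matrix of eigenvectors Q is orthogonal.
    w, Q = np.linalg.eigh(S)
    assert np.allclose(Q.T @ Q, np.eye(4))       # Q^TQ = I
    assert np.allclose(Q @ np.diag(w) @ Q.T, S)  # orthogonal diagonalization

    # Spectral decomposition: S is the sum of w[i] * q_i q_i^T over the
    # orthonormal eigenvectors q_i.
    S_rebuilt = sum(w[i] * np.outer(Q[:, i], Q[:, i]) for i in range(4))
    assert np.allclose(S_rebuilt, S)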