11. The invertible matrix theorem
Throughout this book, we have encountered a great many different characterizations of invertible matrices. Here we collect them all in one place for convenient reference.
Theorem 11.1. For an \(n\times n\) matrix \(A\), the following statements are equivalent:
i. \(A\) is invertible.
ii. There exists a matrix \(B\) with \(AB=I\).
iii. There exists a matrix \(B\) with \(BA=I\).
iv. The linear system \(A\vect{x}=\vect{b}\) has a unique solution for each \(\vect{b}\) in \(\R^{n}\).
v. \(A\) is row equivalent to the identity matrix.
vi. \(A\) has linearly independent columns.
vii. \(A\) has linearly independent rows.
viii. \(A\vect{x}=\vect{0}\) has only the trivial solution.
ix. There is a decomposition \(A=E_{1}\cdots E_{k}\) where each \(E_{i}\) is an elementary matrix.
x. Every column of \(A\) is a pivot column.
xi. The columns of \(A\) span all of \(\R^{n}\).
xii. \(\mathrm{rank}\,A=n\).
xiii. \(\det(A)\neq 0\).
xiv. \(0\) is not an eigenvalue of \(A\).
Proof of Theorem 11.1
For the equivalence of i. through xi., see Theorem 3.4.1 and Exercise 4.2.10. Statement xii. is part of Theorem 4.2.3. Theorem 5.3.1 says precisely that invertibility is equivalent to xiii. For xiv., see Proposition 6.1.4.
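As a numerical illustration of how several of these equivalent conditions can be checked at once, the following sketch uses NumPy (the matrix \(A\) here is a hypothetical example chosen for this illustration, not one from the text) to verify statements i., iv., xii., xiii., and xiv. for one concrete \(2\times 2\) matrix:

```python
import numpy as np

# An example 2x2 matrix; det(A) = 2*1 - 1*1 = 1, so A should pass every test.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
n = A.shape[0]

# xiii. det(A) != 0
assert abs(np.linalg.det(A)) > 1e-12

# xii. rank A = n
assert np.linalg.matrix_rank(A) == n

# xiv. 0 is not an eigenvalue of A
eigenvalues = np.linalg.eigvals(A)
assert all(abs(lam) > 1e-12 for lam in eigenvalues)

# i. A is invertible: A @ A^{-1} = I
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(n))

# iv. A x = b has a (unique) solution for this particular b
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```

Of course, floating-point checks like these only approximate the exact algebraic conditions of the theorem; the point is that for an invertible matrix all the tests succeed together, while replacing \(A\) by a singular matrix (say with a repeated column) makes every one of them fail.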