11. The inverse matrix theorem

Throughout this book, we have encountered a great many different characterizations of invertible matrices. Here we collect them all for convenient reference.

Theorem 11.1

For an \(n\times n\) matrix \(A\), the following are equivalent:

  1. \(A\) is invertible.

  2. There exists a matrix \(B\) with \(AB=I\).

  3. There exists a matrix \(B\) with \(BA=I\).

  4. The linear system \(A\vect{x}=\vect{b}\) has a unique solution for any \(\vect{b}\) in \(\R^{n}\).

  5. \(A\) is row equivalent to the identity matrix.

  6. \(A\) has linearly independent columns.

  7. \(A\) has linearly independent rows.

  8. \(A\vect{x}=\vect{0}\) only has the trivial solution.

  9. There is a decomposition \(A=E_{1}\cdots E_{k}\) where each \(E_{i}\) is an elementary matrix.

  10. Every column of \(A\) is a pivot column.

  11. The columns of \(A\) span all of \(\R^{n}\).

  12. \(\mathrm{rank}\,A=n\).

  13. \(\det(A)\neq 0\).

  14. \(0\) is not an eigenvalue of \(A\).

Proof of Theorem 11.1

For the equivalence of statements 1 through 11, see Theorem 3.4.1 and Exercise 4.2.10. Statement 12 is part of Theorem 4.2.3. Theorem 5.3.1 says precisely that invertibility is equivalent to statement 13. For statement 14, see Proposition 6.1.4.
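
As a quick numerical illustration (not part of the proof), the sketch below uses NumPy to check a handful of the equivalent conditions, namely statements 1, 4, 12, 13 and 14, for one small sample matrix. The matrix \(A\) and vector \(\vect{b}\) are arbitrary choices made for demonstration; for an invertible matrix all the checks should print `True`, and for a singular matrix they should all fail together.

```python
import numpy as np

# An arbitrary 3x3 example matrix; any invertible matrix will do.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# Statement 12: rank A = n.
print("rank A == n:", np.linalg.matrix_rank(A) == n)

# Statement 13: det(A) != 0.
print("det(A) != 0:", not np.isclose(np.linalg.det(A), 0.0))

# Statement 14: 0 is not an eigenvalue of A.
eigenvalues = np.linalg.eigvals(A)
print("0 is not an eigenvalue:", not np.any(np.isclose(eigenvalues, 0.0)))

# Statement 4: A x = b has a (unique) solution for an arbitrary b.
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)
print("solution of A x = b found:", np.allclose(A @ x, b))

# Statement 1: the inverse itself exists, and A A^{-1} = I.
A_inv = np.linalg.inv(A)
print("A A^{-1} == I:", np.allclose(A @ A_inv, np.eye(n)))
```

Of course, floating-point checks like these are only a plausibility test: near-singular matrices can make rank and determinant computations unreliable, which is why the theorem itself is stated and proved exactly.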