7.1. Orthogonal Complements#
In this section, we will introduce the orthogonal complement of a subspace. This concept will make it straightforward to define orthogonal projections.
Suppose \(V\) is a subspace of \(\R^{n}\). Then the orthogonal complement of \(V\) is the set

\[ V^{\bot}=\left\{\vect{u}\in\R^{n}\mid \vect{u}\ip\vect{v}=0 \text{ for all } \vect{v}\in V\right\}; \]

in other words, it is the set of all vectors that are orthogonal to every vector in \(V\).
For a vector to be in \(V^{\bot}\), it suffices that it is orthogonal to all elements in a basis of \(V\) or, slightly more generally, to all elements in a spanning set of \(V\).
Prove Proposition 7.1.1.
Solution to Exercise 7.1.1
Assume the vector \(\vect{u}\) is orthogonal to every \(\vect{v}_{i}\). If \(\vect{v}_{1},...,\vect{v}_{n}\) span \(V\), then any \(\vect{v}\) in \(V\) can be written as \(c_{1}\vect{v}_{1}+\cdots+c_{n}\vect{v}_{n}\) for certain \(c_{1},...,c_{n}\) in \(\R\). But then \(\vect{u}\cdot\vect{v}=c_{1}\vect{u}\cdot\vect{v}_{1}+\cdots+c_{n}\vect{u}\cdot\vect{v}_{n}=0\), so \(\vect{u}\) is orthogonal to \(\vect{v}\).
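To see this argument in action, here is a minimal numerical sketch in Python (the vectors and coefficients below are illustrative choices of ours, not taken from the text): a vector orthogonal to two spanning vectors is automatically orthogonal to every linear combination of them.

```python
import numpy as np

# Hypothetical data: u is orthogonal to the two spanning vectors
# v1 and v2 (in R^3, u = v1 x v2 works).
v1 = np.array([2.0, 1.0, -2.0])
v2 = np.array([4.0, 2.0, 0.0])
u = np.cross(v1, v2)

# An arbitrary linear combination c1*v1 + c2*v2 lies in span{v1, v2}.
rng = np.random.default_rng(0)
c1, c2 = rng.standard_normal(2)
v = c1 * v1 + c2 * v2

# As in the proof: u.v = c1*(u.v1) + c2*(u.v2) = 0.
print(np.dot(u, v1), np.dot(u, v2))   # 0.0 0.0
print(np.isclose(np.dot(u, v), 0.0))  # True
```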
Let us consider some simple examples.
- Let \(V\) be the subspace spanned by a single vector \(\vect{v}\) in \(\R^{2}\) and let \(\vect{u}\) be any vector in \(\R^{2}\), say:

  \[\begin{split} \vect{v}= \begin{bmatrix} 1\\ 2 \end{bmatrix} \text{ and }\vect{u}= \begin{bmatrix} a_{1}\\ a_{2} \end{bmatrix}. \end{split}\]

  Then \(\vect{u}\) is in \(V^{\bot}\) if and only if \(\vect{u}\ip\vect{v}=a_{1}+2a_{2}=0\). So we find that \(V^{\bot}\) is the line described by the equation \(a_{1}+2a_{2}=0\). The vector \(\vect{v}\) is a normal vector to this line.
- Let us now consider two vectors in \(\R^{3}\), for example

  \[\begin{split} \vect{v}_{1}= \begin{bmatrix} 2\\ 1\\ -2 \end{bmatrix} \quad\text{and}\quad\vect{v}_{2}= \begin{bmatrix} 4\\ 2\\ 0 \end{bmatrix}, \end{split}\]

  and let \(V\) be the subspace they span. \(V^{\bot}\) now consists of those vectors \(\vect{u}\) that satisfy both \(\vect{u}\ip\vect{v}_{1}=0\) and \(\vect{u}\ip\vect{v}_{2}=0\). Solving this system of two equations in three variables, we find

  \[\begin{split} V^{\bot}=\left\{ \begin{bmatrix} -t\\ 2t\\ 0 \end{bmatrix} \mid t\in\R \right\}, \end{split}\]

  so \(V^{\bot}\) is a line through the origin in three-dimensional space.
Both examples are illustrated in Figure 7.1.1.
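The second computation can be reproduced with a computer algebra system. Below is a minimal sketch using SymPy (an addition of ours, not part of the text): the conditions \(\vect{u}\ip\vect{v}_{1}=0\) and \(\vect{u}\ip\vect{v}_{2}=0\) say exactly that \(\vect{u}\) lies in the null space of the matrix with rows \(\vect{v}_{1}\) and \(\vect{v}_{2}\).

```python
from sympy import Matrix

# Rows are v1 and v2 from the second example; a vector u satisfies
# u.v1 = 0 and u.v2 = 0 exactly when A*u = 0.
A = Matrix([[2, 1, -2],
            [4, 2,  0]])

# One basis vector for the null space: (-1/2, 1, 0),
# a scalar multiple of (-1, 2, 0) -- the same line as found above.
print(A.nullspace())
```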
In Example 7.1.1, we twice found \(V^{\bot}\) to be a subspace. This is not a coincidence, as Proposition 7.1.2 shows.
For any subspace \(V\) of \(\R^{n}\), the orthogonal complement \(V^{\bot}\) is a subspace, too. Moreover, the only vector that is both in \(V\) and in \(V^{\bot}\) is \(\vect{0}\).
Proof of Proposition 7.1.2
Since the zero vector is orthogonal to any vector, it is in \(V^{\bot}\). Suppose now that \(\vect{u}_{1}\) and \(\vect{u}_{2}\) are in \(V^{\bot}\). Then, for arbitrary \(\vect{v}\) in \(V\), \((\vect{u}_{1}+\vect{u}_{2})\ip \vect{v}=\vect{u}_{1}\ip\vect{v}+\vect{u}_{2}\ip\vect{v}=0\), so \(\vect{u}_{1}+\vect{u}_{2}\) is in \(V^{\bot}\).
Assume now that \(\vect{u}\) is in \(V^{\bot}\) and that \(c\) is any scalar. Then, again for every \(\vect{v}\) in \(V\), \((c\vect{u})\ip\vect{v}=c(\vect{u}\ip\vect{v})=0\) so \(c\vect{u}\) is in \(V^{\bot}\). This shows that \(V^{\bot}\) is a subspace.
If \(\vect{v}\) is both in \(V\) and \(V^{\bot}\), then \(\vect{v}\ip\vect{v}=0\) so \(\vect{v}=\vect{0}\).
As we have seen in Section 4.1, the column space and null space of any \(n\times m\) matrix are subspaces of \(\R^{n}\) and \(\R^{m}\), respectively. It turns out that the transposition \({}^{T}\) and the orthogonal complement \({}^{\bot}\) relate these two spaces to each other.
For any matrix \(A\) we have \(\mathrm{Col}(A^{T})^{\bot}=\mathrm{Nul}(A)\) and \(\mathrm{Col}(A)^{\bot}=\mathrm{Nul}(A^{T})\).
Proof of Proposition 7.1.3
Note that the second claim is easily derived from the first by substituting \(A^{T}\) for \(A\). Let \(\vect{r}_{1},...,\vect{r}_{n}\) be the rows of \(A\). Then \(\vect{r}_{1}^{T},...,\vect{r}_{n}^{T}\) are the columns of \(A^{T}\). For any vector \(\vect{x}\) in \(\R^{m}\), we have

\[\begin{split} A\vect{x}= \begin{bmatrix} \vect{r}_{1}\\ \vdots\\ \vect{r}_{n} \end{bmatrix}\vect{x}= \begin{bmatrix} \vect{r}_{1}^{T}\ip\vect{x}\\ \vdots\\ \vect{r}_{n}^{T}\ip\vect{x} \end{bmatrix}. \end{split}\]

Now, \(\vect{x}\) is in \(\mathrm{Nul}(A)\) precisely when \(A\vect{x}=\vect{0}\) or, in other words, when \(\vect{r}_{i}^{T}\ip\vect{x}=0\) for every \(i\). Since the set \(\left\{\vect{r}_{1}^{T},...,\vect{r}_{n}^{T}\right\}\) spans \(\mathrm{Col}(A^{T})\), Proposition 7.1.1 shows that this is equivalent to \(\vect{x}\) being in \(\mathrm{Col}(A^{T})^{\bot}\).
Since, for any matrix \(A\), the rows of \(A\) are the columns of \(A^{T}\), \(\mathrm{Row}(A)=\mathrm{Col}(A^{T})\). Proposition 7.1.3 then implies that \(\mathrm{Row}(A)^{\bot}=\mathrm{Nul}(A)\).
The strength of Proposition 7.1.3 lies mainly in the fact that it allows us to actually find the orthogonal complement of a given subspace.
Let \(V\) be the subspace of \(\R^{5}\) spanned by the vectors
\(V\) is, by definition, the column space of the matrix \(A=[\vect{v}_{1}\,\vect{v}_{2}\,\vect{v}_{3}]\). By Proposition 7.1.3, we can find the orthogonal complement of \(V\) by finding the null space of \(A^{T}\). By standard computations we find:
so
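This recipe is easy to automate. The following sketch uses SymPy; since the example's specific vectors are not reproduced above, the three spanning vectors in \(\R^{5}\) below are hypothetical ones of our own choosing.

```python
from sympy import Matrix

# Hypothetical spanning vectors in R^5 (the example's own vectors
# are not reproduced here); they become the columns of A.
v1 = [1, 0, 2, 0, 1]
v2 = [0, 1, 1, 1, 0]
v3 = [1, 1, 0, 2, 1]
A = Matrix([v1, v2, v3]).T          # 5x3, so V = Col(A) sits in R^5

# By Proposition 7.1.3, V-perp = Nul(A^T).
basis_perp = A.T.nullspace()
for b in basis_perp:
    print(b.T)

# Sanity check: each basis vector of V-perp is orthogonal to all columns of A.
assert all((A.T * b).is_zero_matrix for b in basis_perp)
```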
If \(V\) is a subspace of \(\R^{n}\), then \(\dim(V)+\dim(V^{\bot})=n\).
Proof of Proposition 7.1.4
Let \(A\) be a matrix whose columns form a basis of \(V\). Then \(n\) is the number of rows of \(A\), which in turn is the number of columns of \(A^{T}\). By Theorem 4.2.2, \(\dim(\mathrm{Col}(A^{T}))+\dim(\mathrm{Nul}(A^{T}))\) equals the number of columns of \(A^{T}\), that is, \(n\). Using Proposition 7.1.3, this yields

\[ \dim(\mathrm{Col}(A^{T}))+\dim(\mathrm{Col}(A)^{\bot})=n. \]

Using Proposition 4.2.5 and \(\mathrm{Row}(A^{T})=\mathrm{Col}(A)\), we find \(\dim(\mathrm{Col}(A^{T}))=\dim(\mathrm{Col}(A))=\dim(V)\), and therefore:

\[ \dim(V)+\dim(V^{\bot})=n. \]
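The formula is also easy to check numerically. Here is a small sketch of ours using SymPy with an arbitrary matrix \(A\); note that the count works out even when the columns of \(A\) are not independent, since \(\dim(\mathrm{Col}(A))=\mathrm{rank}(A)\).

```python
from sympy import randMatrix

# A random 5x3 integer matrix; V = Col(A) is a subspace of R^5
# and V-perp = Nul(A^T) by Proposition 7.1.3.
A = randMatrix(5, 3, seed=1)
dim_V = len(A.columnspace())        # dim(V)
dim_V_perp = len(A.T.nullspace())   # dim(V-perp)

# The two dimensions add up to n = 5, the number of rows of A.
assert dim_V + dim_V_perp == A.rows
print(dim_V, dim_V_perp)
```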
Let \(V\) be a subspace of \(\R^{n}\). For an arbitrary vector \(\vect{u}\) in \(\R^{n}\), there exist unique vectors \(\vect{u}_{V}\) and \(\vect{u}_{V^{\bot}}\) in \(V\) and \(V^{\bot}\), respectively, such that \(\vect{u}=\vect{u}_{V}+\vect{u}_{V^{\bot}}\). This is called the orthogonal decomposition of \(\vect{u}\) with respect to \(V\).
Proof of Proposition 7.1.5
Let \(\vect{v}_{1},...,\vect{v}_{k}\) be a basis for \(V\) and let \(\vect{v}_{k+1},...,\vect{v}_{n}\) be a basis for \(V^{\bot}\) (by Proposition 7.1.4, the latter has \(n-k\) elements). We claim that the vectors \(\vect{v}_{1},...,\vect{v}_{k},\vect{v}_{k+1},...,\vect{v}_{n}\) are linearly independent. Indeed, if there were a linear combination

\[ c_{1}\vect{v}_{1}+\cdots+c_{k}\vect{v}_{k}+c_{k+1}\vect{v}_{k+1}+\cdots+c_{n}\vect{v}_{n}=\vect{0}, \]

we would find that

\[ \vect{w}=c_{1}\vect{v}_{1}+\cdots+c_{k}\vect{v}_{k}=-\left(c_{k+1}\vect{v}_{k+1}+\cdots+c_{n}\vect{v}_{n}\right) \]

is in both \(V\) and \(V^{\bot}\). By Proposition 7.1.2, \(\vect{w}=\vect{0}\). Since \(\vect{v}_{1},...,\vect{v}_{k}\) and \(\vect{v}_{k+1},...,\vect{v}_{n}\) are bases, it follows that all \(c_{i}\) are \(0\).

Since \(\vect{v}_{1},...,\vect{v}_{k},\vect{v}_{k+1},...,\vect{v}_{n}\) is a linearly independent set of \(n\) vectors in \(n\)-dimensional space, it must be a basis. Consequently, every vector \(\vect{u}\) in \(\R^{n}\) can be written in a unique way as a linear combination

\[ \vect{u}=c_{1}\vect{v}_{1}+\cdots+c_{k}\vect{v}_{k}+c_{k+1}\vect{v}_{k+1}+\cdots+c_{n}\vect{v}_{n}. \]
Putting \(\vect{u}_{V}=c_{1}\vect{v}_{1}+\cdots+c_{k}\vect{v}_{k}\) and \(\vect{u}_{V^{\bot}}=c_{k+1}\vect{v}_{k+1}+\cdots +c_{n}\vect{v}_{n}\) finishes the proof.
Consider the vectors
and let \(V\) be the subspace of \(\R^{3}\) spanned by \(\vect{v}_{1}\) and \(\vect{v}_{2}\). Put
It is easy to check that, as the notation suggests, \(\vect{u}_{V}\) is in \(V\) (since \(\vect{u}_{V}=-\vect{v}_{1}+2\vect{v}_{2}\)) and \(\vect{u}_{V^{\bot}}\) is in \(V^{\bot}\) (since \(\vect{u}_{V^{\bot}}\ip\vect{v}_{1}=0=\vect{u}_{V^{\bot}}\ip\vect{v}_{2}\)). So \(\vect{u}=\vect{u}_{V}+\vect{u}_{V^{\bot}}\) is the orthogonal decomposition of \(\vect{u}\) with respect to \(V\). How we can compute such a decomposition will be shown in Section 7.2.
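Verifying a candidate decomposition only requires the two checks used above: \(\vect{u}_{V}\) lies in \(V\) and \(\vect{u}_{V^{\bot}}\) is orthogonal to a spanning set of \(V\). Here is a minimal sketch in Python, with vectors of our own choosing since the example's vectors are not repeated here:

```python
import numpy as np

# Hypothetical data: V = span{v1, v2} in R^3 and a candidate
# decomposition u = u_V + u_perp.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 0.0])
u_V = -v1 + 2 * v2                   # in V by construction
u_perp = np.array([1.0, 0.0, -1.0])  # should lie in V-perp
u = u_V + u_perp

# u_perp is orthogonal to a spanning set of V, hence to all of V
# (Proposition 7.1.1), so this is an orthogonal decomposition of u.
assert np.isclose(np.dot(u_perp, v1), 0.0)
assert np.isclose(np.dot(u_perp, v2), 0.0)
print(u, u_V, u_perp)
```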
7.1.1. Grasple Exercises#
Finding vectors orthogonal to two given vectors in \(\R^4\).
Find the orthogonal complement of a vector \(\vect{u}\) in \(\R^3\) w.r.t. span\(\{\vect{v}_1,\vect{v}_2\}\).
Find a geometric description of \(V^{\bot}\).
Find a basis for the orthogonal complement of span\(\{\vect{v}_1,\vect{v}_2\}\) in \(\R^4\).
Find a basis for the orthogonal complement of the column space of a matrix.