2.5. Linear Independence#

As we have seen (Example 2.2.4 and Example 2.2.5), the multiples of a non-zero vector form a line. We have also seen there that, if we fix two vectors \(\mathbf{v}_{1},\mathbf{v}_{2}\) and consider the set of all vectors of the form \(c_{1}\mathbf{v}_{1}+c_{2}\mathbf{v}_{2}\), where \(c_{1},c_{2}\) range over all scalars, we usually get a plane. But sometimes we don’t! For example, if \(d\mathbf{v}_{1}=\mathbf{v}_{2}\) for some constant \(d\), then all vectors of the given form can be rewritten as \((c_{1}+c_{2}d)\mathbf{v}_{1}\), so they are all contained in the line through the origin in the direction of \(\mathbf{v}_{1}\). Every vector we can make as a linear combination of \(\mathbf{v}_{1}\) and \(\mathbf{v}_{2}\) can also be made with \(\mathbf{v}_{1}\) alone. The vector \(\mathbf{v}_{2}\) is superfluous. This situation can be seen in Figure 2.5.1.

../_images/Fig-LinInd-Examplein1D.svg

Fig. 2.5.1 The set \(\left\lbrace\mathbf{v}_{1},\mathbf{v}_{2}\right\rbrace\) contains two vectors, but one of them is superfluous. Every vector one can make as a linear combination of \(\mathbf{v}_{1}\) and \(\mathbf{v}_{2}\) can also be made with just \(\mathbf{v}_{1}\).#

We will now formalise this concept of superfluous vectors.

Definition 2.5.1

We will call a set \(S\) of vectors linearly dependent if there is some \(\mathbf{v}\) in \(S\) such that \(\Span{S}=\Span{S\setminus\left\lbrace\mathbf{v}\right\rbrace}\). In this case, we say that \(\mathbf{v}\) is linearly dependent on \(S\setminus\left\lbrace\mathbf{v}\right\rbrace\). If \(S\) is not linearly dependent, we say \(S\) is linearly independent.

In other words, a set \(S\) is linearly dependent if it contains at least one superfluous vector. It may very well contain infinitely many. Let us briefly investigate linear dependence for very small sets before we get to more substantial examples.

Proposition 2.5.1

Let \(S\) be a subset of \(\mathbb{R}^{n}\) containing

  • precisely one vector, say \(\mathbf{v}\). Then \(S\) is linearly dependent precisely when \(\mathbf{v}=\mathbf{0}\).

  • precisely two vectors, say \(\mathbf{u}\) and \(\mathbf{v}\). Then \(S\) is linearly independent unless one of these vectors is a multiple of the other.

Proof of Proposition 2.5.1

  • Assume \(S=\left\lbrace\mathbf{v}\right\rbrace\). The span of \(S\setminus\left\lbrace\mathbf{v}\right\rbrace\) is the span of the empty set, which is precisely \(\left\lbrace\mathbf{0}\right\rbrace\). This is equal to \(\Span{S}\) if and only if \(\mathbf{v}=\mathbf{0}\).

  • Assume \(S=\left\lbrace\mathbf{u},\mathbf{v}\right\rbrace\) is linearly dependent. Then \(\Span{S}=\Span{\mathbf{u}}\) or \(\Span{S}=\Span{\mathbf{v}}\). If \(\Span{S}=\Span{\mathbf{v}}\) then \(\mathbf{u}\) is in \(\Span{\mathbf{v}}\) so it is a multiple of \(\mathbf{v}\). Similarly, if \(\Span{S}=\Span{\mathbf{u}}\) then \(\mathbf{v}\) is in \(\Span{\mathbf{u}}\) so it is a multiple of \(\mathbf{u}\). Conversely, if one of the vectors, say \(\mathbf{u}\), is a multiple of the other, then every linear combination of \(\mathbf{u}\) and \(\mathbf{v}\) is a multiple of \(\mathbf{v}\), so \(\Span{S}=\Span{\mathbf{v}}\) and \(S\) is linearly dependent.
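The two-vector criterion is easy to check numerically. The sketch below uses NumPy (an assumption of ours; the text itself uses no software, and the function name is hypothetical): the set \(\{\mathbf{u},\mathbf{v}\}\) is linearly independent exactly when the matrix with columns \(\mathbf{u}\) and \(\mathbf{v}\) has rank 2.

```python
import numpy as np

def is_independent_pair(u, v):
    # {u, v} is linearly independent exactly when neither vector is a
    # multiple of the other, i.e. when the matrix with columns u and v
    # has rank 2.
    return bool(np.linalg.matrix_rank(np.column_stack([u, v])) == 2)

print(is_independent_pair([1, 0], [0, 1]))  # True: not multiples of each other
print(is_independent_pair([1, 2], [2, 4]))  # False: [2, 4] = 2 * [1, 2]
```

Note that the same rank test also handles the zero vector: if one of the two vectors is \(\mathbf{0}\), the rank drops below 2 and the pair is reported as dependent, consistent with Proposition 2.5.2 below.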

As you can see from the proof of Proposition 2.5.1, our definition of linear dependence, while intuitive, is a bit hard to work with. In Proposition 2.5.3/Corollary 2.5.1, we will see a more convenient way to determine whether a given set of vectors is linearly dependent or not. But let us first consider some examples.

Example 2.5.1

  1. Consider the vectors

    \[\begin{split} \mathbf{v}_{1}= \begin{bmatrix}1\\0\end{bmatrix}\quad\mathbf{v}_{2}= \begin{bmatrix}0\\1\end{bmatrix} \quad\mathbf{v}_{3}= \begin{bmatrix}1\\1\end{bmatrix}, \end{split}\]

    which are shown on the left in Figure 2.5.2. The set \(S=\left\lbrace\mathbf{v}_{1},\mathbf{v}_{2},\mathbf{v}_{3}\right\rbrace\) is linearly dependent in view of the following equalities:

    (2.5.1)#\[ \mathbf{v}_{1}=-\mathbf{v}_{2}+\mathbf{v}_{3},\]
    (2.5.2)#\[ \mathbf{v}_{2}=-\mathbf{v}_{1}+\mathbf{v}_{3},\]
    (2.5.3)#\[ \mathbf{v}_{3}=\mathbf{v}_{1}+\mathbf{v}_{2}.\]

    Indeed, if we take an arbitrary vector \(\mathbf{v}\) in \(\Span{S}\), we can write it as

    \[\begin{align*} \mathbf{v}&=c_{1}\mathbf{v}_{1}+c_{2}\mathbf{v}_{2}+c_{3}\mathbf{v}_{3}\\ &=(c_{2}-c_{1})\mathbf{v}_{2}+(c_{3}+c_{1})\mathbf{v}_{3} \end{align*}\]

    in view of equation (2.5.1). This means that \(\mathbf{v}\) is also in \(\Span{S\setminus\left\lbrace\mathbf{v}_{1}\right\rbrace}\) and consequently that \(\mathbf{v}_{1}\) is linearly dependent on \(\mathbf{v}_{2}\) and \(\mathbf{v}_{3}\). Similarly, equation (2.5.2) shows that \(\mathbf{v}_{2}\) is linearly dependent on \(\mathbf{v}_{1}\) and \(\mathbf{v}_{3}\) and equation (2.5.3) shows that \(\mathbf{v}_{3}\) is linearly dependent on \(\mathbf{v}_{1}\) and \(\mathbf{v}_{2}\) .

    However, every subset of \(S\) containing precisely two vectors will be linearly independent, as \(S\) contains no two vectors that are multiples of each other.

  2. Consider now the vectors

    \[\begin{split} \mathbf{v}_{1}= \begin{bmatrix}1\\0\end{bmatrix}\quad \mathbf{v}_{2}= \begin{bmatrix}0\\1\end{bmatrix}\quad\mathbf{v}_{4}= \begin{bmatrix}2\\0\end{bmatrix} \end{split}\]

    which are shown on the right in Figure 2.5.2. The set \(S=\left\lbrace\mathbf{v}_{1},\mathbf{v}_{2},\mathbf{v}_{4}\right\rbrace\) is again linearly dependent since

    \[ \mathbf{v}_{4}=2\mathbf{v}_{1}+0\mathbf{v}_{2} \]

    but now the subset \(\left\lbrace\mathbf{v}_{1},\mathbf{v}_{4}\right\rbrace\) is a linearly dependent subset of \(S\). On the other hand, the subsets \(\left\lbrace\mathbf{v}_{1},\mathbf{v}_{2}\right\rbrace\) and \(\left\lbrace\mathbf{v}_{2},\mathbf{v}_{4}\right\rbrace\) are linearly independent.

    ../_images/Fig-LinInd-Examplein2D.svg

    Fig. 2.5.2 The vectors from 1. on the left and from 2. on the right. On the left, there is no vector which is a multiple of another vector, so every set of two vectors is linearly independent. On the right this is not the case. The vectors \(\mathbf{v}_{1}\) and \(\mathbf{v}_{4}\) are multiples of each other and therefore \(\left\lbrace\mathbf{v}_{1},\mathbf{v}_{4}\right\rbrace\) is linearly dependent.#

  3. Put

    \[\begin{split} \mathbf{w}_{1}= \begin{bmatrix}1\\0\\0\end{bmatrix},\quad\mathbf{w}_{2}= \begin{bmatrix}0\\1\\0\end{bmatrix},\quad\mathbf{w}_{3}= \begin{bmatrix}1\\2\\0\end{bmatrix},\quad \text{and}\quad\mathbf{w}_{4}= \begin{bmatrix}1\\2\\1\end{bmatrix}. \end{split}\]

    The set \(\left\lbrace\mathbf{w}_{1},\mathbf{w}_{2},\mathbf{w}_{3}\right\rbrace\) is linearly dependent. The set \(\left\lbrace\mathbf{w}_{1},\mathbf{w}_{2},\mathbf{w}_{4}\right\rbrace\), however, is not. This is illustrated in Figure 2.5.3.

    ../_images/Fig-LinInd-Examplein3D.svg

    Fig. 2.5.3 The four vectors from 3. Note that \(\mathbf{w}_{3}\) lies in the plane spanned by \(\mathbf{w}_{1}\) and \(\mathbf{w}_{2}\) but \(\mathbf{w}_{4}\) does not. This means that \(\left\lbrace\mathbf{w}_{1},\mathbf{w}_{2},\mathbf{w}_{3}\right\rbrace\) is linearly dependent but \(\left\lbrace\mathbf{w}_{1},\mathbf{w}_{2},\mathbf{w}_{4}\right\rbrace\) is not.#

Grasple Exercise 2.5.1

https://embed.grasple.com/exercises/e7ff6fad-218f-4583-907f-514b3980698a?id=70195

To verify whether a set \(\{\vect{u}, \vect{v}\}\) is linearly independent.

Grasple Exercise 2.5.2

https://embed.grasple.com/exercises/96efb1e1-8994-4067-88b2-a24fb58c63cb?id=70196

To verify whether a set \(\{\vect{u}, \vect{v}\}\) is linearly independent.

Grasple Exercise 2.5.3

https://embed.grasple.com/exercises/956e2076-9232-4b43-aad9-ddbbf8252a71?id=70197

To verify whether a set \(\{\vect{u}, \vect{v}\}\) is linearly independent.

The following elementary properties of linear dependence and linear independence will be used throughout the text, often tacitly.

Proposition 2.5.2

Let \(S\) be a subset of \(\mathbb{R}^{n}\).

  • If \(S\) contains \(\mathbf{0}\), then it is a linearly dependent set.

  • If \(S\) is linearly dependent and \(S\subseteq T\), then \(T\) is linearly dependent.

  • If \(T\) is linearly independent and \(S\subseteq T\), then \(S\) is linearly independent.

We leave the verifications of these statements to the reader.

Exercise 2.5.1

Prove Proposition 2.5.2.

But how do you determine whether a set of vectors is linearly independent or not? Like so many problems in linear algebra, it comes down to solving a system of linear equations, as Proposition 2.5.3 shows.

Proposition 2.5.3

A set \(\left\lbrace\mathbf{v}_{1},...,\mathbf{v}_{k}\right\rbrace\) of vectors in \(\mathbb{R}^{n}\) is linearly dependent if and only if the vector equation

(2.5.4)#\[c_{1}\mathbf{v}_{1}+\cdots +c_{k}\mathbf{v}_{k}=\mathbf{0}\]

has a non-trivial solution. That is, a solution where not all \(c_i\) are equal to \(0\).

Proof of Proposition 2.5.3

If \(\left\lbrace\mathbf{v}_{1},...,\mathbf{v}_{k}\right\rbrace\) is linearly dependent, one of these vectors, say \(\mathbf{v}_{i}\), is linearly dependent on the others, i.e. it is in \(\Span{\mathbf{v}_{1},...,\mathbf{v}_{i-1},\mathbf{v}_{i+1},...,\mathbf{v}_{k}}\). Therefore, there exist some scalars \(c_{1},...,c_{i-1},c_{i+1},...,c_{k}\) such that

\[ \mathbf{v}_{i}=c_{1}\mathbf{v}_{1}+\cdots +c_{i-1}\mathbf{v}_{i-1}+c_{i+1}\mathbf{v}_{i+1}+\cdots +c_{k}\mathbf{v}_{k} \]

or equivalently

\[ 0=c_{1}\mathbf{v}_{1}+\cdots +c_{i-1}\mathbf{v}_{i-1}-\mathbf{v}_{i}+c_{i+1}\mathbf{v}_{i+1}+\cdots +c_{k}\mathbf{v}_{k}. \]

This means that \((c_{1},...,c_{i-1},-1,c_{i+1},...,c_{k})\) is a solution of the equation (2.5.4). It is a non-trivial one since the \(i\)-th coefficient is \(-1\) which is non-zero.

If (2.5.4) has a non-trivial solution then there are \(c_{1},...,c_{k}\), not all \(0\), such that \(c_{1}\mathbf{v}_{1}+\cdots +c_{k}\mathbf{v}_{k}=\mathbf{0}\). Take any \(i\) such that \(c_{i}\neq0\). Then

\[\mathbf{v}_{i}=-\frac{c_{1}}{c_{i}}\mathbf{v}_{1}-\cdots -\frac{c_{i-1}}{c_{i}}\mathbf{v}_{i-1}-\frac{c_{i+1}}{c_{i}}\mathbf{v}_{i+1}-\cdots -\frac{c_{k}}{c_{i}}\mathbf{v}_{k}. \]

This implies \(\mathbf{v}_{i}\) is in \(\Span{\mathbf{v}_{1},...,\mathbf{v}_{i-1},\mathbf{v}_{i+1},...,\mathbf{v}_{k}}\) so \(\left\lbrace\mathbf{v}_{1},...,\mathbf{v}_{k}\right\rbrace\) is linearly dependent.

Corollary 2.5.1

A set \(\left\lbrace\mathbf{v}_{1},\dots,\mathbf{v}_{k}\right\rbrace\) of vectors in \(\mathbb{R}^{n}\) is linearly dependent if and only if the matrix equation \(A\mathbf{x}=\mathbf{0}\) with \(A=\begin{bmatrix}\mathbf{v}_{1}& \cdots &\mathbf{v}_{k}\end{bmatrix}\) has a non-trivial solution, i.e. if \(A\) has a column without a pivot.

Again we leave the verification to the diligent reader.

Exercise 2.5.2

Prove Corollary 2.5.1.

Example 2.5.2

Consider the following three vectors in \(\mathbb{R}^{4}\):

\[\begin{split} \mathbf{v}_{1}= \begin{bmatrix} 1\\1\\0\\-2 \end{bmatrix}\quad\mathbf{v}_{2}= \begin{bmatrix}-1\\2\\3\\-2\end{bmatrix}\quad\mathbf{v}_{3}= \begin{bmatrix} 4\\1\\-3\\-4\end{bmatrix}. \end{split}\]

Do these vectors form a linearly dependent set? How do we find out? Well, we use the vectors as the columns of a matrix \(A\) and compute an echelon form using standard techniques:

\[\begin{split}A= \begin{bmatrix}1&-1&4\\1&2&1\\0&3&-3\\-2&-2&-4 \end{bmatrix}\sim\cdots\sim \begin{bmatrix} 1&-1&4\\0&3&-3\\0&0&0\\0&0&0\end{bmatrix}. \end{split}\]

The third column has no pivot, so the system \(A\mathbf{x}=\mathbf{0}\) has infinitely many solutions. In particular, it therefore has a non-trivial one. Consequently, the set \(\left\lbrace\mathbf{v}_{1},\mathbf{v}_{2},\mathbf{v}_{3}\right\rbrace\) is linearly dependent.

From the reduced echelon form, we can easily find a way to write \(\mathbf{v}_{3}\) as a linear combination of \(\mathbf{v}_{1}\) and \(\mathbf{v}_{2}\). In this case, the reduced echelon form is

\[\begin{split}A= \begin{bmatrix}1&-1&4\\1&2&1\\0&3&-3\\-2&-2&-4 \end{bmatrix}\sim\cdots\sim \begin{bmatrix} 1&0&3\\0&1&-1\\0&0&0\\0&0&0\end{bmatrix}. \end{split}\]

If we put the free variable \(x_{3}\) equal to 1, we find \(x_{1}=-3\) and \(x_{2}=1\), which gives:

\[-3\mathbf{v}_{1}+\mathbf{v}_{2}+\mathbf{v}_{3}=\mathbf{0}\quad\text{hence}\quad \mathbf{v}_{3}=3\mathbf{v}_{1}-\mathbf{v}_{2}. \]
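The computation in Example 2.5.2 can be reproduced with a computer algebra system. A sketch, assuming SymPy (not a tool used by the text): `rref` returns the reduced echelon form together with the indices of the pivot columns, and setting the free variable to 1 recovers the dependence relation.

```python
from sympy import Matrix, zeros

# The matrix A from Example 2.5.2, with columns v1, v2, v3.
A = Matrix([[ 1, -1,  4],
            [ 1,  2,  1],
            [ 0,  3, -3],
            [-2, -2, -4]])

R, pivots = A.rref()
print(pivots)  # (0, 1): the third column has no pivot, so the set is dependent

# Setting the free variable x3 = 1 gives x1 = -3 and x2 = 1:
v1, v2, v3 = A[:, 0], A[:, 1], A[:, 2]
print(-3*v1 + v2 + v3 == zeros(4, 1))  # True, so v3 = 3*v1 - v2
```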

Grasple Exercise 2.5.4

https://embed.grasple.com/exercises/4d23327b-93df-41b8-bc55-481e82ba28c0?id=70201

To verify whether a set \(\{\vect{a}_1, \vect{a}_2,\vect{a}_3 \}\) is linearly independent.

Grasple Exercise 2.5.5

https://embed.grasple.com/exercises/cf7c40ff-3d18-4a98-9bc5-20a4ea263bb0?id=87417

To verify whether a set \(\{\vect{a}_1, \vect{a}_2,\vect{a}_3 \}\) is linearly independent.

We now present a few statements that can be helpful in determining whether a given set of vectors is linearly dependent or not. The first one tells us that an ordered set is linearly dependent precisely when some vector in it depends on the preceding vectors.

Theorem 2.5.1

An ordered set \(S=(\mathbf{v}_{1},...,\mathbf{v}_{n})\) is linearly dependent if and only if there is a \(k\) such that \(\mathbf{v}_{k}\) is a linear combination of \(\mathbf{v}_{1},...,\mathbf{v}_{k-1}\).

Proof of Theorem 2.5.1

Let us assume \(\mathbf{v}_{k}=c_{1}\mathbf{v}_{1}+\cdots+c_{k-1}\mathbf{v}_{k-1}\) for some scalars \(c_{1},...,c_{k-1}\). An arbitrary element \(\mathbf{v}\) of \(\Span{S}\) is a linear combination of \(\mathbf{v}_{1},...,\mathbf{v}_{n}\), so it is

\[\mathbf{v}=d_{1}\mathbf{v}_{1}+\cdots+d_{k-1}\mathbf{v}_{k-1}+d_{k}\mathbf{v}_{k}+d_{k+1}\mathbf{v}_{k+1}+\cdots+ d_{n}\mathbf{v}_{n} \]

for certain scalars \(d_{1},...,d_{n}\). We can now rewrite \(\mathbf{v}\) as

\[\mathbf{v}=d_{1}\mathbf{v}_{1}+\cdots+d_{k-1}\mathbf{v}_{k-1}+d_{k}(c_{1}\mathbf{v}_{1}+\cdots+c_{k-1}\mathbf{v}_{k-1})+d_{k+1}\mathbf{v}_{k+1}+\cdots+ d_{n}\mathbf{v}_{n} \]

so \(\mathbf{v}\) is in \(\Span{S\setminus\left\lbrace\mathbf{v}_{k}\right\rbrace}\).

Suppose now that \(S\) is linearly dependent. Let \(k\) be maximal such that \(\Span{S}=\Span{S\setminus\left\lbrace\mathbf{v}_{k}\right\rbrace}\). Since \(\mathbf{v}_{k}\) is in \(S\), it is in \(\Span{S}=\Span{S\setminus\left\lbrace\mathbf{v}_{k}\right\rbrace}\). So there exist scalars \(c_{1},...,c_{k-1},c_{k+1},...,c_{n}\) such that

(2.5.5)#\[\mathbf{v}_{k}=c_{1}\mathbf{v}_{1}+\cdots+c_{k-1}\mathbf{v}_{k-1}+c_{k+1}\mathbf{v}_{k+1}+\cdots+ c_{n}\mathbf{v}_{n}.\]

If we can show that \(c_{k+1}=...=c_{n}=0\) we are done, because then we have written \(\mathbf{v}_{k}\) as a linear combination of \(\mathbf{v}_{1},...,\mathbf{v}_{k-1}\). We will prove by contradiction that no other possibility can occur. Assume \(c_{j}\neq0\) for some \(j\) greater than \(k\). Then Equation (2.5.5) yields

\[\mathbf{v}_{j}=\frac{1}{c_{j}}(-c_{1}\mathbf{v}_{1}-\cdots -c_{k-1}\mathbf{v}_{k-1}+\mathbf{v}_{k}-c_{k+1}\mathbf{v}_{k+1}-\cdots-c_{j-1}\mathbf{v}_{j-1}-c_{j+1}\mathbf{v}_{j+1}-\cdots -c_{n}\mathbf{v}_{n}). \]

Consequently, any linear combination of \(S\) can be rewritten as a linear combination of \(\mathbf{v}_{1},...,\mathbf{v}_{j-1},\mathbf{v}_{j+1},...,\mathbf{v}_{n}\), i.e. \(\Span{S}=\Span{S\setminus\left\lbrace\mathbf{v}_{j}\right\rbrace}\). But \(j\) is larger than \(k\) and we have assumed \(k\) to be maximal with this property! This is impossible, so \(c_{j}=0\) for all \(j\) greater than \(k\).
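Theorem 2.5.1 suggests a procedure for locating a superfluous vector in an ordered set: scan the vectors from left to right and stop at the first one that is a linear combination of its predecessors. The sketch below (using SymPy, an assumption of ours; the function name is hypothetical) detects this by checking whether appending a vector as a new column increases the rank.

```python
from sympy import Matrix

def first_dependent_vector(vectors):
    # Return the index of the first vector that is a linear combination
    # of the preceding ones, or None if the set is linearly independent.
    # Appending a column leaves the rank unchanged exactly when that
    # column already lies in the span of the previous columns.
    M = Matrix.zeros(len(vectors[0]), 0)
    for i, v in enumerate(vectors):
        extended = M.row_join(Matrix(v))
        if extended.rank() == M.rank():
            return i
        M = extended
    return None

# w1, w2, w3 from Example 2.5.1: w3 = w1 + 2*w2 depends on its predecessors.
print(first_dependent_vector([[1, 0, 0], [0, 1, 0], [1, 2, 0]]))  # 2
print(first_dependent_vector([[1, 0, 0], [0, 1, 0], [1, 2, 1]]))  # None
```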

Theorem 2.5.1 together with Corollary 2.5.1 implies that the columns of a matrix are linearly dependent if and only if there is some column which is linearly dependent on the preceding ones. This observation will allow us to simplify certain arguments. The following result essentially claims that if \(k\) vectors suffice to span a set \(S\) and you have more than \(k\) vectors in \(S\), then at least one of them must be superfluous.

Theorem 2.5.2

Suppose \(\mathbf{u}_{1},...,\mathbf{u}_{k}\) and \(\mathbf{v}_{1},...,\mathbf{v}_{l}\) are all vectors in \(\mathbb{R}^{n}\). If \(k<l\) and \(\Span{\mathbf{u}_{1},...,\mathbf{u}_{k}}\) contains \(\Span{\mathbf{v}_{1},...,\mathbf{v}_{l}}\) then the set \(\left\lbrace\mathbf{v}_{1},...,\mathbf{v}_{l}\right\rbrace\) is linearly dependent.

Proof of Theorem 2.5.2

Consider the matrices

\[A=\begin{bmatrix}\mathbf{u}_{1}& \cdots &\mathbf{u}_{k}\end{bmatrix},\quad B=\begin{bmatrix}\mathbf{v}_{1}& \cdots &\mathbf{v}_{l}\end{bmatrix}\quad \text{and}\quad C=\left[\,A \mid B\,\right]. \]

Bringing \(C\) into echelon form gives

\[C\sim D=\left[\,E \mid F\,\right] \]

where \(D\) is an echelon form of \(C\), \(E\) is an echelon form of \(A\), and \(F\) is equivalent to \(B\).

We claim that all of the pivot positions of \(D\) are in \(E\). Indeed, suppose that the \(i\)-th column of \(F\), let us call it \(\mathbf{f}_{i}\), contains a pivot. Then \(E\mathbf{x}=\mathbf{f}_{i}\) is inconsistent and therefore \(A\mathbf{x}=\mathbf{v}_{i}\) is also inconsistent, since the elementary row operations preserve linear combinations. But this implies that \(\mathbf{v}_{i}\) is not a linear combination of \(\mathbf{u}_{1},...,\mathbf{u}_{k}\), hence it is not in \(\Span{\mathbf{u}_{1},...,\mathbf{u}_{k}}\). This contradicts our assumption that \(\Span{\mathbf{u}_{1},...,\mathbf{u}_{k}}\) contains \(\Span{\mathbf{v}_{1},...,\mathbf{v}_{l}}\).

Since \(F\) contains no pivot positions of \(D\), it has at least as many zero rows as \(E\). This implies that an echelon form \(G\) of \(B\), which is necessarily also an echelon form of \(F\), must also have at least as many zero rows as \(E\). Therefore, \(G\) has no more pivots than \(E\). Since \(E\) has at most \(k\) pivots and \(k<l\), \(G\) must have a column without a pivot. So \(B\mathbf{x}=\mathbf{0}\) has a non-trivial solution and by Corollary 2.5.1 the set \(\left\lbrace\mathbf{v}_{1},...,\mathbf{v}_{l}\right\rbrace\) must be linearly dependent.

Corollary 2.5.2

Let \(S\) be a subset of \(\mathbb{R}^{n}\). If there are more than \(n\) vectors in \(S\), then \(S\) is linearly dependent.

Proof of Corollary 2.5.2

Take distinct vectors \(\mathbf{v}_{1},...,\mathbf{v}_{n+1}\) in \(S\). \(\Span{\mathbf{v}_{1},...,\mathbf{v}_{n+1}}\) is contained in \(\Span{\mathbf{e}_{1},...,\mathbf{e}_{n}}=\mathbb{R}^{n}\) and \(n+1>n\), so \(\left\lbrace\mathbf{v}_{1},...,\mathbf{v}_{n+1}\right\rbrace\) is linearly dependent by Theorem 2.5.2. Since this set is contained in \(S\), \(S\) must be linearly dependent, too, by Proposition 2.5.2.

Example 2.5.3

To illustrate the strength of Corollary 2.5.2, consider the following set of vectors in \(\mathbb{R}^{5}\):

\[\begin{split}\left\lbrace \begin{bmatrix}5\\-2\\3\\1\\0\end{bmatrix}, \begin{bmatrix}-47\\8\\12\\-3\\4\end{bmatrix}, \begin{bmatrix}12\\-3\\-2\\-1\\11\end{bmatrix}, \begin{bmatrix}42\\-7\\-52\\2\\16\end{bmatrix}, \begin{bmatrix}87\\56\\-32\\1\\0\end{bmatrix}, \begin{bmatrix}-48\\2\\35\\156\\8\end{bmatrix}\right\rbrace. \end{split}\]

If we had to bring the matrix with these six vectors as columns to echelon form, we would have our work cut out for us! Fortunately, we can just remark that there are six vectors with five entries each. Since \(6>5\), Corollary 2.5.2 guarantees that this set is linearly dependent.
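The shortcut is easy to confirm numerically. A minimal sketch, assuming NumPy: the rank of the matrix with these six vectors as columns can be at most 5, so at least one column must be a linear combination of the others.

```python
import numpy as np

# The six vectors from Example 2.5.3 as the columns of a 5x6 matrix.
V = np.array([
    [  5, -47,  12,  42,  87, -48],
    [ -2,   8,  -3,  -7,  56,   2],
    [  3,  12,  -2, -52, -32,  35],
    [  1,  -3,  -1,   2,   1, 156],
    [  0,   4,  11,  16,   0,   8],
])

rank = np.linalg.matrix_rank(V)
print(rank < V.shape[1])  # True: fewer pivots than columns, so dependent
```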

Example 2.5.4 (Application)

Often, on news sites or in newspapers, you might see the standings of a football tournament displayed in a large table, as in Table 2.5.1. Quite a lot of the information in such a table is redundant because some of the columns are linearly dependent.

Table 2.5.1 The final standings of the first season of the Eredivisie football played in 1956-57, as shown on Wikipedia on Wednesday, March 23rd 2022.#

|    | Team             | Pld | W  | D  | L  | GF | GA | GD  | Pts |
|----|------------------|-----|----|----|----|----|----|-----|-----|
| 1  | AFC Ajax         | 34  | 22 | 5  | 7  | 64 | 40 | 24  | 49  |
| 2  | Fortuna ‘54      | 34  | 20 | 5  | 9  | 76 | 48 | 28  | 45  |
| 3  | SC Enschede      | 34  | 15 | 11 | 8  | 81 | 47 | 34  | 41  |
| 4  | MVV Maastricht   | 34  | 15 | 10 | 9  | 53 | 42 | 11  | 40  |
| 5  | PSV Eindhoven    | 34  | 18 | 3  | 13 | 93 | 71 | 22  | 39  |
| 6  | Feijenoord       | 34  | 15 | 9  | 10 | 79 | 58 | 21  | 39  |
| 7  | VVV ‘03          | 34  | 16 | 6  | 12 | 50 | 53 | -3  | 38  |
| 8  | Sparta Rotterdam | 34  | 12 | 12 | 10 | 66 | 59 | 7   | 36  |
| 9  | NAC              | 34  | 14 | 8  | 12 | 59 | 61 | -2  | 36  |
| 10 | DOS              | 34  | 17 | 1  | 16 | 79 | 75 | 4   | 35  |
| 11 | Rapid JC         | 34  | 13 | 7  | 14 | 64 | 63 | 1   | 33  |
| 12 | NOAD             | 34  | 12 | 7  | 15 | 54 | 64 | -10 | 31  |
| 13 | BVC Amsterdam    | 34  | 11 | 8  | 15 | 49 | 67 | -18 | 30  |
| 14 | GVAV             | 34  | 9  | 10 | 15 | 52 | 66 | -14 | 28  |
| 15 | BVV              | 34  | 11 | 4  | 19 | 70 | 76 | -6  | 26  |
| 16 | Elinkwijk        | 34  | 10 | 4  | 20 | 52 | 87 | -35 | 24  |
| 17 | Willem II        | 34  | 8  | 6  | 20 | 59 | 79 | -20 | 22  |
| 18 | FC Eindhoven     | 34  | 8  | 4  | 22 | 39 | 83 | -44 | 20  |

Each of the eight columns on the right can be interpreted as a vector with one entry per team, so a vector in \(\mathbb{R}^{18}\). Using the column headers from Table 2.5.1 as the names of these vectors, we find:

\[\mathbf{Pts}=2\mathbf{W}+1\mathbf{D}+0\mathbf{L} \]

since a win yielded 2 points, a draw 1 point, and a loss 0 points. This means that \(\left\lbrace\mathbf{W},\mathbf{D},\mathbf{L},\mathbf{Pts}\right\rbrace\) is a linearly dependent subset of \(\mathbb{R}^{18}\). In fact, the smaller set \(\left\lbrace\mathbf{W},\mathbf{D},\mathbf{Pts}\right\rbrace\) is already linearly dependent. Similarly, the column \(\mathbf{GD}\), which gives the goal difference for each team, can be obtained by subtracting the column \(\mathbf{GA}\), which gives the goals conceded, from \(\mathbf{GF}\), which gives the goals scored.
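These dependence relations can be spot-checked directly against the data. A small sketch, assuming NumPy (the arrays below are transcribed from Table 2.5.1, one entry per team, top to bottom):

```python
import numpy as np

# Columns W, D, L, GF, GA, GD and Pts of Table 2.5.1, top to bottom.
W   = np.array([22, 20, 15, 15, 18, 15, 16, 12, 14, 17, 13, 12, 11,  9, 11, 10,  8,  8])
D   = np.array([ 5,  5, 11, 10,  3,  9,  6, 12,  8,  1,  7,  7,  8, 10,  4,  4,  6,  4])
L   = np.array([ 7,  9,  8,  9, 13, 10, 12, 10, 12, 16, 14, 15, 15, 15, 19, 20, 20, 22])
GF  = np.array([64, 76, 81, 53, 93, 79, 50, 66, 59, 79, 64, 54, 49, 52, 70, 52, 59, 39])
GA  = np.array([40, 48, 47, 42, 71, 58, 53, 59, 61, 75, 63, 64, 67, 66, 76, 87, 79, 83])
GD  = np.array([24, 28, 34, 11, 22, 21, -3,  7, -2,  4,  1, -10, -18, -14, -6, -35, -20, -44])
Pts = np.array([49, 45, 41, 40, 39, 39, 38, 36, 36, 35, 33, 31, 30, 28, 26, 24, 22, 20])

print(np.array_equal(Pts, 2*W + 1*D + 0*L))        # True: Pts = 2W + 1D + 0L
print(np.array_equal(GD, GF - GA))                 # True: GD = GF - GA
print(np.array_equal(W + D + L, np.full(18, 34)))  # True: Pld is also redundant
```

The last check shows yet another redundancy: the constant column \(\mathbf{Pld}\) equals \(\mathbf{W}+\mathbf{D}+\mathbf{L}\), since every team played 34 games.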

2.5.1. Grasple Exercises#

Grasple Exercise 2.5.6

https://embed.grasple.com/exercises/d40a884f-c356-4ba7-a777-92fd5f4fffcd?id=70202

To verify whether a set \(\{\vect{a}_1, \vect{a}_2,\vect{a}_3 \}\) (in \(\mathbb{R}^3\)) is linearly independent.

Grasple Exercise 2.5.7

https://embed.grasple.com/exercises/d40a884f-c356-4ba7-a777-92fd5f4fffcd?id=70202

Finding a parameter \(h\) such that a set \(\{\vect{a}_1, \vect{a}_2,\vect{b}\}\) is linearly independent.

Grasple Exercise 2.5.8

https://embed.grasple.com/exercises/7341c27a-5482-4303-b20f-4a3965c99535?id=70209

Like the previous question.

Grasple Exercise 2.5.9

https://embed.grasple.com/exercises/018f7ed5-d1ac-490b-add4-40568f525878?id=70213

Like the previous question.

Grasple Exercise 2.5.10

https://embed.grasple.com/exercises/c42855ae-f0af-4b52-9a89-fab0e1bdf877?id=87321

To verify whether three vectors in \(\mathbb{R}^4\) are linearly independent.

Grasple Exercise 2.5.11

https://embed.grasple.com/exercises/d78332f6-0a0b-404a-973d-adc745782ab6?id=70204

To verify whether the columns of a matrix \(A\) are linearly independent.

Grasple Exercise 2.5.12

https://embed.grasple.com/exercises/9345f478-7f65-4239-a9ea-26929131f010?id=70205

Like the previous question.

Grasple Exercise 2.5.13

https://embed.grasple.com/exercises/9345f478-7f65-4239-a9ea-26929131f010?id=70205

Verifying linear (in)dependence of a set of vectors.

Grasple Exercise 2.5.14

https://embed.grasple.com/exercises/fce5512f-2e50-4973-8de1-d8d569e497b4?id=70208

Verifying linear (in)dependence of a set of vectors once more.

Grasple Exercise 2.5.15

https://embed.grasple.com/exercises/fce5512f-2e50-4973-8de1-d8d569e497b4?id=70208

For which types of objects is linear independence defined?

Grasple Exercise 2.5.16

https://embed.grasple.com/exercises/5cb7db99-e91c-4f82-94ce-e69c042e14af?id=70191

Getting straight the definition of linear independence.

Grasple Exercise 2.5.17

https://embed.grasple.com/exercises/4da2f0e7-eef3-4acc-baea-ac689bda49f3?id=87426

Can … be linearly independent?

Grasple Exercise 2.5.18

https://embed.grasple.com/exercises/cd594e68-f4b5-41fa-839e-139dd5a2c428?id=70198

True/False question about linear (in)dependence.

Grasple Exercise 2.5.19

https://embed.grasple.com/exercises/221b7ec2-e749-4528-9be9-ee6138e2f13d?id=70199

True/False question about linear (in)dependence.

Grasple Exercise 2.5.20

https://embed.grasple.com/exercises/c277e508-cced-46f8-911f-4fb0dce4bd18?id=70200

True/False question about linear (in)dependence.

Grasple Exercise 2.5.21

https://embed.grasple.com/exercises/72bf1389-22c8-4588-8f73-4c7215b7cea4?id=70217

About the connection between pivots and linearly (in)dependent columns.

Grasple Exercise 2.5.22

https://embed.grasple.com/exercises/85cbd7f9-9ab3-4e4a-9c19-152453ce0c52?id=68883

About the connection between pivots and linearly (in)dependent columns.

Grasple Exercise 2.5.23

https://embed.grasple.com/exercises/b7fa29e0-5d17-4a74-b173-0219f69fb2a3?id=68884

One more about pivots and linearly (in)dependent columns.

Grasple Exercise 2.5.24

https://embed.grasple.com/exercises/804d59bd-3813-484b-b0b9-f79a1d6921c2?id=87427

True/False question about linear (in)dependence.

Grasple Exercise 2.5.25

https://embed.grasple.com/exercises/5f21724d-6e37-456f-b661-3a7bfd83fb39?id=70219

True/False question about linear (in)dependence.

Grasple Exercise 2.5.26

https://embed.grasple.com/exercises/bcd13c4c-50e6-4c8c-be50-abd4ba71ef2f?id=70221

True/False question about linear (in)dependence.

Grasple Exercise 2.5.27

https://embed.grasple.com/exercises/274a9a74-5977-435e-8374-59e6f93c1262?id=70194

What can not be concluded from the fact that \(A\mathbf{x} = \mathbf{b}\) is (in)consistent?

Grasple Exercise 2.5.28

https://embed.grasple.com/exercises/c395902a-87aa-4858-915b-7ddc5513cb85?id=87398

What about linear combinations of (three) linearly independent vectors?

Grasple Exercise 2.5.29

https://embed.grasple.com/exercises/8f89db8f-777f-4f11-9e09-23ddacf7a08d?id=87428

What about linear combinations of (four) linearly independent vectors?

Grasple Exercise 2.5.30

https://embed.grasple.com/exercises/43810ef4-d9c7-4097-8d05-f91dd67bbb43?id=68868

What about subsets and unions of sets of linearly independent vectors?