Applied Mathematics

Section 4.2 The Span and Basis of a Subspace

In Section 2.5, we saw the span of a set of vectors in \(\mathbb{R}^n\text{.}\) We now extend this notion to vectors in any vector space. In addition, the notion of a basis of a subspace is introduced.

Subsection 4.2.1 Linear Independence

In Section 2.6, the linear independence of vectors in \(\mathbb{R}^n\) was defined and many examples were shown. We extend this notion to vectors in a general vector space.

Definition 4.2.1.

Let \(S = \left\{\boldsymbol{v}_1, \boldsymbol{v}_2, \ldots, \boldsymbol{v}_n\right\}\) be a set of vectors in a vector space \(V\text{.}\) The set \(S\) is linearly independent if for \(c_1, c_2, \ldots, c_n \in \mathbb{R}\text{,}\) the equation \(c_1 \boldsymbol{v}_1 + c_2 \boldsymbol{v}_2 + \cdots + c_n \boldsymbol{v}_n = \boldsymbol{0}\) implies that \(c_1 = c_2 = \cdots = c_n = 0\text{.}\) If not, then the set is linearly dependent.

Example 4.2.2.

Show that the set
\begin{equation*} S = \left\{ 1, x, x^2 \right\} \end{equation*}
is linearly independent in \({\cal P}_2\text{.}\)
Solution.
To show this, we need to show that if
\begin{equation*} c_1 (1) + c_2 (x) + c_3 (x^2) = 0 \end{equation*}
then \(c_1 = c_2 = c_3 = 0\text{.}\) Note that the zero on the right hand side is the zero polynomial. Thus, equating coefficients, we get the system
\begin{align*} c_1 \amp = 0 \\ c_2 \amp = 0 \\ c_3 \amp = 0 \end{align*}
which has the only solution of \(c_1 = c_2 = c_3 = 0\text{,}\) thus the set is linearly independent.
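This kind of check can be automated by representing each polynomial by its vector of coefficients; independence then amounts to a rank computation. A minimal sketch using NumPy (the coefficient-vector encoding is our choice, not part of the text):

```python
import numpy as np

# Represent each polynomial in P_2 by its coefficient vector
# (constant, x, x^2). This encoding is an assumption for illustration.
p1 = np.array([1, 0, 0])   # 1
p2 = np.array([0, 1, 0])   # x
p3 = np.array([0, 0, 1])   # x^2

# Stack the vectors as columns. The set is linearly independent exactly
# when c1*p1 + c2*p2 + c3*p3 = 0 forces c1 = c2 = c3 = 0, i.e. when the
# matrix has full column rank.
A = np.column_stack([p1, p2, p3])
print(np.linalg.matrix_rank(A) == A.shape[1])  # True -> independent
```

Here the matrix is the identity, so full rank is immediate, matching the system solved above.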

Example 4.2.3.

Are the matrices in the set
\begin{equation*} \left\{ \begin{bmatrix} 1 \amp 1 \\ 1 \amp 1 \end{bmatrix}, \begin{bmatrix} 1 \amp -1 \\ 1 \amp -1 \end{bmatrix}, \begin{bmatrix} 2 \amp 2 \\ -2 \amp -2 \end{bmatrix} \right\} \end{equation*}
linearly independent or dependent in \({\cal M}_{2 \times 2}\text{?}\)
Solution.
To determine this, we need to solve
\begin{equation*} c_1 \begin{bmatrix} 1 \amp 1 \\ 1 \amp 1 \end{bmatrix} + c_2 \begin{bmatrix} 1 \amp -1 \\ 1 \amp -1 \end{bmatrix} + c_3 \begin{bmatrix} 2 \amp 2 \\ -2 \amp -2 \end{bmatrix} = \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix} \end{equation*}
Equating corresponding entries of the matrices gives the system:
\begin{equation*} \begin{aligned} c_1 + c_2 + 2c_3 \amp = 0 \\ c_1 - c_2 + 2c_3 \amp = 0 \\ c_1 + c_2 - 2c_3 \amp = 0 \\ c_1 - c_2 - 2c_3 \amp = 0 \end{aligned} \end{equation*}
This system has the unique solution \(c_1 = c_2 = c_3 = 0\text{,}\) thus the set is linearly independent.
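The same rank test works here after flattening each matrix into a vector in \(\mathbb{R}^4\text{.}\) A sketch in NumPy (the flattening is our choice of encoding):

```python
import numpy as np

# Flatten each 2x2 matrix into a vector in R^4; linear independence of
# the matrices is equivalent to independence of these vectors.
M1 = np.array([[1, 1], [1, 1]]).flatten()
M2 = np.array([[1, -1], [1, -1]]).flatten()
M3 = np.array([[2, 2], [-2, -2]]).flatten()

# Columns of A are the flattened matrices; this is exactly the 4-equation
# system from the example.
A = np.column_stack([M1, M2, M3])
print(np.linalg.matrix_rank(A))  # 3 -> only the trivial solution, independent
```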

Subsection 4.2.2 Subspaces

Recall that it is important to know when a set of vectors spans a space, because then any vector in the space can be written as a linear combination of those vectors. Since we have extended the notion of a vector space, the span will play the same role.

Definition 4.2.4.

Let \(U\) be a subset of a vector space \(V\text{.}\) If \(U\) is also a vector space, then \(U\) is a subspace.
The next three examples show that we have already seen subspaces: several subsets of familiar vector spaces are vector spaces themselves.

Example 4.2.5.

We showed in Example 4.1.3 that a line in \(\mathbb{R}^2\) that passes through the origin is a vector space. Since such a line is a subset of \(\mathbb{R}^2\text{,}\) it is a subspace of \(\mathbb{R}^2\) as well.

Example 4.2.6.

Show that \(\mathbb{R}^2\) is a subspace of \(\mathbb{R}^3\text{.}\)
Solution.
Identifying \(\mathbb{R}^2\) with the set of vectors in \(\mathbb{R}^3\) whose third component is zero, this set is itself a vector space and a subset of \(\mathbb{R}^3\text{,}\) so \(\mathbb{R}^2\) is a subspace of \(\mathbb{R}^3\text{.}\)

Example 4.2.7.

Recall that the set
\begin{equation*} {\cal P}_2=\{ a_0 + a_1 x + a_2 x^2\; | \; a_0, a_1, a_2 \in \mathbb{R} \} \end{equation*}
is the set of all polynomials of degree at most 2.
The set \({\cal P}_1 = \{a_0 + a_1 x \; | \; a_0, a_1 \in \mathbb{R} \}\) of all linear functions is itself a vector space as well as a subset of \({\cal P}_2\text{,}\) therefore \({\cal P}_1\) is a subspace of \({\cal P}_2\text{.}\)
In addition, the set \(\{ a x^2\; | \; a \in \mathbb{R}\}\) is a vector space as well as a subset of \({\cal P}_2\text{,}\) therefore it is a subspace.
The above examples show that many subspaces are already familiar. In many cases, though, it is not evident that a set is a subspace, and showing it directly would require proving all 10 properties of a vector space. The next lemma shows that this is not necessary.

Lemma 4.2.8.

Let \(S\) be a nonempty subset of a vector space \(V\text{.}\) If \(r_1 \boldsymbol{s}_1 + r_2 \boldsymbol{s}_2 \in S\) for all \(\boldsymbol{s}_1, \boldsymbol{s}_2 \in S\) and all \(r_1, r_2 \in \mathbb{R}\text{,}\) then \(S\) is a subspace of \(V\text{.}\)

Proof.

This means that if \(S\) is a subset of a vector space \(V\text{,}\) then to prove that \(S\) is a subspace, we only need to check that \(r_1 \boldsymbol{s}_1 + r_2 \boldsymbol{s}_2 \in S\text{.}\)
Since \(S\) is a subset of \(V\text{,}\) properties (2), (3), (5) and (7)-(10) of Definition 4.1.1 hold automatically for \(S\text{.}\) Thus we only need to prove closure under addition and scalar multiplication as well as the existence of the identity element.
Property 1: Because \(r_1 \boldsymbol{s}_1 + r_2 \boldsymbol{s}_2 \in S\text{,}\) let \(r_1=r_2=1\text{,}\) thus \(\boldsymbol{s}_1+\boldsymbol{s}_2 \in S\text{.}\)
Property 4: If \(r_1=0\) and \(r_2=0\text{,}\) then this shows that \(\boldsymbol{0} \in S\text{,}\) so there is an identity element.
Property 6: Because \(r_1 \boldsymbol{s}_1 + r_2 \boldsymbol{s}_2 \in S\text{,}\) let \(r_2=0\text{,}\) thus \(r_1 \boldsymbol{s}_1 \in S\text{.}\)
We will use this lemma to prove that certain sets are subspaces.

Example 4.2.9.

Show that
\begin{equation*} V = \left\{ \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \; | \; v_2 = k v_1 \right\} \end{equation*}
(that is, all vectors on a line of slope \(k\)) is a subspace of \(\mathbb{R}^2\text{.}\)
Solution.
We will use Lemma 4.2.8. Let
\begin{align*} \boldsymbol{u} \amp = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} \amp \boldsymbol{v} \amp = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \end{align*}
be elements of \(V\text{.}\) That is \(v_2 = k v_1\) and \(u_2 = ku_1\text{.}\) Then
\begin{align*} r_1 \boldsymbol{u} + r_2 \boldsymbol{v} \amp = r_1 \begin{bmatrix} u_1 \\ k u_1 \end{bmatrix} + r_2 \begin{bmatrix} v_1 \\ k v_1 \end{bmatrix} = \begin{bmatrix} r_1 u_1 + r_2 v_1 \\ r_1 k u_1 + r_2 k v_1 \end{bmatrix} \\ \amp = \begin{bmatrix} r_1 u_1 + r_2 v_1 \\ k (r_1 u_1 + r_2 v_1) \end{bmatrix} \end{align*}
which is an element of \(V\) because the second component is \(k\) times the first one. Thus \(V\) is a subspace of \(\mathbb{R}^2\text{.}\)
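The closure computation above can be sanity-checked numerically. A small NumPy sketch, where the slope \(k\text{,}\) the vectors, and the scalars are arbitrary sample values, not from the text:

```python
import numpy as np

# Sample slope and two vectors on the line v2 = k*v1 (all values arbitrary).
k = 3.0
u = np.array([2.0, k * 2.0])    # u2 = k*u1, so u is in V
v = np.array([-1.0, k * -1.0])  # v2 = k*v1, so v is in V

r1, r2 = 5.0, -2.0
w = r1 * u + r2 * v

# w stays on the line: its second component is k times its first.
print(np.isclose(w[1], k * w[0]))  # True
```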

Example 4.2.10.

Show using Lemma 4.2.8 that
\begin{equation*} V =\left\{ \begin{bmatrix} a \amp 0 \\ 0 \amp b \end{bmatrix} \; | \; a,b \in \mathbb{R} \right\} \end{equation*}
the set of all diagonal matrices is a subspace of \({\cal M}_{2 \times 2}\text{,}\) the vector space of all \(2 \times 2\) matrices.
Solution.
In this case, we need to show that for any two matrices
\begin{align*} A \amp = \begin{bmatrix} a_1 \amp 0 \\ 0 \amp b_1 \end{bmatrix}, \amp B \amp = \begin{bmatrix} a_2 \amp 0 \\ 0 \amp b_2 \end{bmatrix} \end{align*}
and scalars \(r_1, r_2 \in \mathbb{R}\text{,}\) the combination \(r_1 A + r_2 B\) is in the set. We compute
\begin{align*} r_1 A + r_2 B \amp = r_1 \begin{bmatrix} a_1 \amp 0 \\ 0 \amp b_1 \end{bmatrix}+ r_2 \begin{bmatrix} a_2 \amp 0 \\ 0 \amp b_2 \end{bmatrix} \\ \amp = \begin{bmatrix} r_1 a_1 + r_2 a_2 \amp 0 \\ 0 \amp r_1 b_1 + r_2 b_2 \end{bmatrix} \end{align*}
which is a diagonal matrix and therefore in \(V\text{,}\) thus \(V\) is a subspace.
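As a numerical sketch of this closure property in NumPy (the entries and scalars below are arbitrary sample values):

```python
import numpy as np

# Two sample diagonal matrices and scalars (arbitrary values).
A = np.diag([1.0, 4.0])
B = np.diag([-2.0, 3.0])
C = 2.0 * A + 5.0 * B

# C equals the diagonal matrix built from its own diagonal, i.e. its
# off-diagonal entries are zero, so C is back in the set V.
print(np.allclose(C, np.diag(np.diag(C))))  # True
```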
In Section 2.7, we explored the null space of a matrix. In the next lemma, we show that any null space is a subspace.

Lemma 4.2.11.

The null space of an \(m \times n\) matrix \(A\) is a subspace of \(\mathbb{R}^n\text{.}\)

Proof.

We will use Lemma 4.2.8 to prove this. Let both \(\boldsymbol{x}\) and \(\boldsymbol{y}\) be in the null space of \(A\text{.}\) This means that \(A\boldsymbol{x}=\boldsymbol{0}\) and \(A\boldsymbol{y}=\boldsymbol{0}\text{.}\) We need to show that \(r_1 \boldsymbol{x} + r_2 \boldsymbol{y}\) is in the null space of \(A\text{.}\)
\begin{align*} A(r_1 \boldsymbol{x} + r_2 \boldsymbol{y}) \amp = r_1 A\boldsymbol{x} + r_2 A\boldsymbol{y} \\ \amp = r_1 (\boldsymbol{0}) + r_2 (\boldsymbol{0}) = \boldsymbol{0} \end{align*}
Vectors in the null space have length \(n\text{,}\) so the null space is a subset of \(\mathbb{R}^n\text{.}\) Since \(r_1 \boldsymbol{x} + r_2 \boldsymbol{y}\) is in the null space of \(A\text{,}\) the null space is a subspace of \(\mathbb{R}^n\text{.}\)
This is an important result that we will use when we study eigenvalues in the next chapter.
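The closure argument in the proof can be checked numerically for a sample matrix. In this NumPy sketch, the matrix \(A\text{,}\) the two null-space vectors, and the scalars are hand-picked for illustration:

```python
import numpy as np

# Sample 1x3 matrix; its null space is a plane in R^3.
A = np.array([[1.0, 2.0, 3.0]])
x = np.array([2.0, -1.0, 0.0])   # A @ x = 0, so x is in null(A)
y = np.array([3.0, 0.0, -1.0])   # A @ y = 0, so y is in null(A)

r1, r2 = 4.0, -7.0
# The linear combination stays in the null space, as in the proof.
print(np.allclose(A @ (r1 * x + r2 * y), 0))  # True
```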

Subsection 4.2.3 The Span of a Set of Vectors

We saw in Section 2.5 the span of a set of vectors in \(\mathbb{R}^n\text{.}\) We now generalize this to any vector space.

Definition 4.2.12.

The span of a nonempty subset \(S=\{\boldsymbol{s}_1, \boldsymbol{s}_2, \ldots, \boldsymbol{s}_n\}\) of a vector space is the set of all linear combinations of the vectors in \(S\text{.}\) That is,
\begin{equation*} \text{span}(S) = \{ c_1 \boldsymbol{s}_1 + c_2 \boldsymbol{s}_2 + \cdots + c_n \boldsymbol{s}_n\; | \; \text{$c_1, c_2, \ldots, c_n \in \mathbb{R}, \boldsymbol{s}_1, \boldsymbol{s}_2, \ldots, \boldsymbol{s}_n \in S$} \}. \end{equation*}
To show that a set of vectors spans a subspace, we need to show that any vector in the subspace can be written as a linear combination of the spanning vectors.

Example 4.2.13.

Show that the set \(\{2+x,1,x+x^2\}\) spans \(\mathcal{P}_2\text{.}\)
Solution.
In this case, we need to show that a general polynomial in \(\mathcal{P}_2\) can be written as a linear combination of elements of the given set. That is,
\begin{equation*} c_1 (2+x) + c_2 (1) + c_3 (x+x^2) = a_0 + a_1 x + a_2 x^2 \end{equation*}
and if there is a solution for the \(c\)’s, then the set spans \(\mathcal{P}_2\text{.}\) To find the solution, use the technique of equating coefficients. Write down the coefficients for the constant terms, \(x\) terms and \(x^2\) terms respectively.
\begin{align*} 2 c_1 + c_2 \amp = a_0 \\ c_1 + c_3 \amp = a_1 \\ c_3 \amp = a_2 \end{align*}
This has a solution \(c_3=a_2, c_1 = a_1-a_2\) and \(c_2 = a_0 - 2(a_1-a_2)\text{,}\) which means that a linear combination of the three β€œvectors” can form any quadratic function, thus the given set spans \(\mathcal{P}_2\text{.}\)
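The coefficient system can also be solved numerically. A NumPy sketch, using the arbitrary target polynomial \(5 + 4x + 3x^2\) as a sample:

```python
import numpy as np

# Coefficient matrix of the system: columns are the coefficient vectors
# of 2+x, 1, and x+x^2 (rows: constant, x, x^2 coefficients).
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])

# Sample target polynomial 5 + 4x + 3x^2 (an arbitrary choice).
a = np.array([5.0, 4.0, 3.0])
c = np.linalg.solve(M, a)     # solvable because M is invertible

print(np.allclose(M @ c, a))  # True: the set reaches this polynomial
# c == [1., 3., 3.], matching c1 = a1-a2, c2 = a0-2(a1-a2), c3 = a2.
```

Because \(M\) is invertible, the same computation succeeds for every choice of \(a_0, a_1, a_2\text{,}\) which is exactly the spanning claim.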

Lemma 4.2.14.

For a nonempty subset \(S\) of a vector space \(V\text{,}\) the set \(\text{span}(S)\) is a subspace of \(V\text{.}\)

Proof.

Let \(S\) be the subset and \(\boldsymbol{s}_1, \boldsymbol{s}_2, \ldots, \boldsymbol{s}_n\) be the elements of \(S\text{.}\) Using LemmaΒ 4.2.8, we need to check that \(\text{span}(S)\) is closed under linear combinations. Let
\begin{align*} \boldsymbol{v} \amp = c_1 \boldsymbol{s}_1 + c_2 \boldsymbol{s}_2 + \cdots + c_n \boldsymbol{s}_n, \\ \boldsymbol{w} \amp = k_1 \boldsymbol{s}_1 + k_2 \boldsymbol{s}_2 + \cdots + k_n \boldsymbol{s}_n \end{align*}
be two elements of \(\text{span}(S)\text{.}\) Then
\begin{align*} r_1 \boldsymbol{v} + r_2 \boldsymbol{w} \amp = r_1 (c_1 \boldsymbol{s}_1 + c_2 \boldsymbol{s}_2 + \cdots + c_n \boldsymbol{s}_n) + r_2 (k_1 \boldsymbol{s}_1 + k_2 \boldsymbol{s}_2 + \cdots + k_n \boldsymbol{s}_n) \\ \amp = (r_1 c_1 + r_2 k_1) \boldsymbol{s}_1 + (r_1c_2 + r_2 k_2) \boldsymbol{s}_2 + \cdots + (r_1 c_n + r_2 k_n) \boldsymbol{s}_n \end{align*}
Since this shows that \(r_1 \boldsymbol{v} + r_2 \boldsymbol{w}\) is in \(\text{span}(S)\text{,}\) \(\text{span}(S)\) is a subspace.
This lemma allows us to talk about a vector space in terms of the vectors that span it. For example, instead of thinking of \(\mathcal{P}_2\text{,}\) we can think of the span of \(\{2+x,1,x+x^2\}\) (in this case the description as a span may not be more helpful, but in other cases it is).

Example 4.2.15.

Show that the following vectors span \(\mathbb{R}^3\text{:}\)
\begin{align*} \boldsymbol{e}_1 \amp = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \amp \boldsymbol{e}_2 \amp = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \amp \boldsymbol{e}_3 \amp = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \end{align*}
Solution.
Because any vector \(\boldsymbol{x}=[x_1,x_2,x_3]^{\intercal}\) can be written as \(\boldsymbol{x}=x_1 \boldsymbol{e}_1 + x_2 \boldsymbol{e}_2 + x_3 \boldsymbol{e}_3\text{,}\) these vectors span \(\mathbb{R}^3\text{.}\)

Example 4.2.16.

Does \(\{2+x,x^2 \}\) span \(\mathcal{P}_2\text{?}\)
Solution.
To determine this, we will try to write a general polynomial in \(\mathcal{P}_2\text{,}\)
\begin{equation*} a_0 + a_1 x + a_2 x^2 \end{equation*}
as a linear combination of the set of vectors or
\begin{equation*} a_0 + a_1 x + a_2 x^2 = c_1 (2+x) + c_2 x^2 \end{equation*}
and equating coefficients,
\begin{align*} a_0 \amp = 2c_1 \\ a_1 \amp = c_1 \\ a_2 \amp = c_2 \end{align*}
For general \(a_0\) and \(a_1\text{,}\) there is no solution, because \(c_1\) cannot simultaneously equal \(a_1\) and \(a_0/2\text{,}\) so \(\{2+x,x^2\}\) does not span \(\mathcal{P}_2\text{.}\)
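The failure to span can also be seen as a dimension count: two vectors can span at most a 2-dimensional subspace of the 3-dimensional space \(\mathcal{P}_2\text{.}\) A NumPy sketch using coefficient vectors:

```python
import numpy as np

# Coefficient vectors of 2+x and x^2 as columns
# (rows: constant, x, x^2 coefficients).
M = np.column_stack([[2.0, 1.0, 0.0],   # 2 + x
                     [0.0, 0.0, 1.0]])  # x^2

# The column space has dimension 2, less than dim(P_2) = 3, so these
# two polynomials cannot span all of P_2.
print(np.linalg.matrix_rank(M))  # 2
```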