

Section 2.7 Row Space, Column Space and Rank of a Matrix

When we examine a matrix, we can think about the number of linearly independent rows or columns it has, or about the span of its set of rows or its set of columns. We will see how these ideas are connected to other concepts from this chapter.

Subsection 2.7.1 Row Space of a Matrix

Definition 2.7.1.

The row space of a matrix is the span of the set of its rows. The row rank is the number of linearly independent rows of the matrix.
The following example finds the row space and row rank of a \(2 \times 2\) matrix.

Example 2.7.2.

The row space of
\begin{equation*} \begin{bmatrix} 2 \amp 1 \\ 1 \amp 0 \end{bmatrix} \end{equation*}
is the set of all \(2\)-component row vectors with the form:
\begin{equation*} \left\{ c_1 \cdot \begin{bmatrix} 2 \amp 1 \end{bmatrix} + c_2 \begin{bmatrix} 1 \amp 0 \end{bmatrix} \; | \; c_1, c_2 \in \mathbb{R} \right\} \end{equation*}
and this is the set of all \(2\)-component row vectors. The row rank of this matrix is 2 because these row vectors are linearly independent.
We can calculate the row space and row rank of non-square matrices as is shown in the following example.

Example 2.7.3.

The row space of
\begin{equation*} \begin{bmatrix} 2 \amp 1 \amp -4 \\ 4 \amp 2 \amp -8 \end{bmatrix} \end{equation*}
can be written as
\begin{equation*} \{ c_1 \cdot \begin{bmatrix} 2 \amp 1 \amp -4 \end{bmatrix} + c_2 \begin{bmatrix} 4 \amp 2 \amp -8 \end{bmatrix} \; | \; c_1, c_2 \in \mathbb{R} \} \end{equation*}
However, since the two vectors are not linearly independent, the row space can be written as
\begin{equation*} \{ c_1 \cdot \begin{bmatrix} 2 \amp 1 \amp -4 \end{bmatrix}\; | \; c_1 \in \mathbb{R} \} \end{equation*}
Therefore the row rank is 1.
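As a quick check (not part of the text itself), the two row ranks above can be verified with Python's SymPy library, whose Matrix.rank() method counts the linearly independent rows:

    from sympy import Matrix

    # Matrix from Example 2.7.2: the two rows are linearly independent
    A = Matrix([[2, 1],
                [1, 0]])
    print(A.rank())        # 2, so the row rank is 2

    # Matrix from Example 2.7.3: the second row is twice the first
    B = Matrix([[2, 1, -4],
                [4, 2, -8]])
    print(B.rank())        # 1, so the row rank is 1
    print(B.rowspace())    # a single row vector spans the row space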

Theorem 2.7.4.

If a matrix \(B\) is obtained from a matrix \(A\) by a sequence of elementary row operations, then \(\text{rowspace}(A) = \text{rowspace}(B)\text{.}\)

Proof.

By Corollary 1.3.18, each row of \(B\) is a linear combination of the rows of \(A\text{,}\) so each row of \(B\) is in the row space of \(A\text{;}\) that is, \(\text{rowspace}(B) \subset \text{rowspace}(A)\text{.}\)
Similarly, each row of \(A\) is a linear combination of the rows of \(B\text{,}\) so by the same argument \(\text{rowspace}(A) \subset \text{rowspace}(B)\text{.}\) Therefore \(\text{rowspace}(A)=\text{rowspace}(B)\text{.}\)

Proof.

Although this proof is almost trivial, the main point is to reframe what we saw in the first chapter in terms of this newer framework.

Remark 2.7.6.

Gaussian Reduction to echelon form eliminates linear dependence between the rows, leaves the row space unchanged, and results in a set of linearly independent rows whose span is the row space.

Example 2.7.7.

Consider the linear system (written as a matrix) from Example 1.3.7:
\begin{equation*} \begin{bmatrix} 4 \amp 0 \amp -1 \amp 0 \\ 1 \amp 3 \amp 2 \amp 3 \\ 0 \amp 3 \amp 5 \amp 14 \end{bmatrix} \end{equation*}
Perform row operations to put it in echelon form and find the row rank.
Solution.
The row operations are
\begin{equation*} \begin{array}{r} -R_1+4R_2 \rightarrow R_2, \\ -R_2 +4R_3 \rightarrow R_3 \end{array} \qquad \begin{bmatrix} 4 \amp 0 \amp -1 \amp 0\\ 0 \amp 12 \amp 9 \amp 12\\ 0 \amp 0 \amp 11 \amp 44\\ \end{bmatrix} \end{equation*}
Since this is in echelon form now, there are three linearly independent rows, so the row rank is 3.
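The same reduction can be reproduced with SymPy (again, an illustration rather than part of the text); rref() returns the reduced row echelon form together with the pivot columns, and the number of pivots equals the row rank:

    from sympy import Matrix

    A = Matrix([[4, 0, -1,  0],
                [1, 3,  2,  3],
                [0, 3,  5, 14]])

    # rref() returns (reduced row echelon form, tuple of pivot column indices)
    R, pivots = A.rref()
    print(R)
    print(len(pivots))     # 3, matching the row rank found above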

Subsection 2.7.2 Column Space of a Matrix

Definition 2.7.8.

The column space of a matrix is the span of the set of its columns. The column rank is the number of linearly independent columns of the matrix.

Example 2.7.9.

Find the column space and column rank of
\begin{equation*} A = \begin{bmatrix} 1 \amp 3\\ -5 \amp 4\\ 1 \amp 3 \\ \end{bmatrix} \end{equation*}
Solution.
In this case, since the columns of the matrix are linearly independent (because one can see that they are not multiples of one another), the column space is
\begin{equation*} \left\{ c_1 \begin{bmatrix} 1 \\ -5 \\ 1\end{bmatrix} + c_2 \begin{bmatrix}3 \\ 4 \\ 3 \end{bmatrix} \; | \; c_1, c_2 \in \mathbb{R}\right\} \end{equation*}
which is the span of the columns. And since there are two linearly independent vectors, the column rank is 2.
Instead of trying to eliminate any linear dependence among the columns as written, we can treat the columns as rows and use the row-reduction tools above. Recall that the matrix obtained by turning the columns into rows is called the transpose of the matrix.

Remark 2.7.10.

To find the column space of a matrix \(A\text{,}\) find the row space of \(A^{\intercal}\) and transpose the resulting row space.

Example 2.7.11.

Find the column space and column rank of
\begin{equation*} A = \begin{bmatrix} 1 \amp 3 \amp 1\\ 2 \amp 0 \amp -4\\ 0 \amp 1 \amp 1 \\ 3 \amp 4 \amp -2 \end{bmatrix} \end{equation*}
Solution.
  1. First take the transpose
    \begin{equation*} A^{\intercal} = \begin{bmatrix} 1 \amp 2 \amp 0 \amp 3 \\ 3 \amp 0 \amp 1 \amp 4 \\ 1 \amp -4 \amp 1 \amp -2 \end{bmatrix} \end{equation*}
  2. Now row reduce the matrix:
    \begin{align*} \begin{array}{r} -3R_1 + R_2 \rightarrow R_2, \\ -R_1 + R_3 \rightarrow R_3, \end{array} \qquad \begin{bmatrix} 1\amp 2 \amp 0 \amp 3 \\ 0 \amp -6 \amp1 \amp -5 \\ 0 \amp -6 \amp 1 \amp -5 \end{bmatrix} \\ -R_2 + R_3 \rightarrow R_3 \qquad \begin{bmatrix} 1\amp 2 \amp 0 \amp 3 \\ 0 \amp -6 \amp1 \amp -5 \\ 0 \amp 0 \amp 0 \amp 0 \\ \end{bmatrix} \end{align*}
    which is now in echelon form, so we know that the non-zero rows are linearly independent.
  3. Take the transpose of the first two rows. The span of this is the column space:
    \begin{equation*} \text{colspace}(A) = \text{span}\left(\left\{ \begin{bmatrix} 1 \\ 2 \\ 0 \\ 3 \end{bmatrix}, \begin{bmatrix} 0 \\ -6 \\ 1 \\ -5 \end{bmatrix} \right\} \right) \end{equation*}
And since there are two linearly independent columns, the column rank is 2.
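A sketch of the same computation in SymPy (illustrative only): row reducing the transpose recovers a spanning set for the column space, and columnspace() gives a basis directly. SymPy keeps the pivot columns of \(A\) itself, so its basis may look different from the one found above while spanning the same space.

    from sympy import Matrix

    A = Matrix([[1, 3,  1],
                [2, 0, -4],
                [0, 1,  1],
                [3, 4, -2]])

    # Reduce the transpose, as in the example; its nonzero rows, transposed
    # back into columns, span the column space of A.
    R, pivots = A.T.rref()
    print(R)

    # A basis for the column space taken from the pivot columns of A itself.
    basis = A.columnspace()
    print(basis)
    print(len(basis))      # 2, the column rank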
The following theorem explains the relationship between the row rank and the column rank.

Theorem 2.7.12.

For any matrix, the row rank equals the column rank.

Because of this fact (which we won't prove here), the rank of a matrix is typically the interesting property.

Definition 2.7.13.

The rank of a matrix is its row rank or, equivalently, its column rank.

Subsection 2.7.3 Null Space of a Matrix

Definition 2.7.14.

The null space of a matrix \(A\) is the set of all vectors \(\vec{x}\) that satisfy \(A \vec{x}=\vec{0}\text{.}\) The number of linearly independent vectors needed to span the null space is called the nullity of \(A\text{.}\)

Example 2.7.15.

Find the null space of the matrix
\begin{equation*} A = \begin{bmatrix} 1 \amp 2 \amp 3\\ 2 \amp 3 \amp -1 \end{bmatrix} \end{equation*}
Solution.
We find the null space by solving the homogeneous matrix equation \(A\vec{x} = \vec{0}\text{.}\) Writing the augmented matrix
\begin{align*} \amp \qquad \left[\begin{array}{rrr|r} 1 \amp 2 \amp 3\amp 0 \\ 2 \amp 3 \amp -1 \amp 0 \end{array}\right] \end{align*}
and performing the row operation \(-2R_1+R_2 \rightarrow R_2\)
\begin{align*} \amp \qquad \left[\begin{array}{rrr|r} 1 \amp 2 \amp 3 \amp 0 \\ 0 \amp -1 \amp -7 \amp 0 \end{array}\right]\\ 2R_2 + R_1 \rightarrow R_1 \amp \qquad \left[\begin{array}{rrr|r} 1 \amp 0 \amp -11 \amp 0 \\ 0 \amp -1 \amp -7 \amp 0 \end{array}\right] \\ -R_2 \rightarrow R_2 \amp \qquad \left[\begin{array}{rrr|r} 1 \amp 0 \amp -11 \amp 0 \\ 0 \amp 1 \amp 7 \amp 0 \end{array}\right] \end{align*}
results in the reduced row-echelon form. The resulting equations are
\begin{align*} x_1 -11 x_3 \amp = 0 \\ x_2 +7x_3 \amp = 0 \end{align*}
and the solution set can be written in vector form as
\begin{equation*} \left\{ \begin{bmatrix} 11 \\ -7 \\ 1 \end{bmatrix} x_3 \; | \; x_3 \in \mathbb{R} \right\} \end{equation*}
and this set is the null space of the matrix. The null space is
\begin{equation*} \text{span} \left(\left\{\begin{bmatrix} 11 \\ -7 \\ 1 \end{bmatrix}\right\}\right) \end{equation*}
and therefore the dimension of the null space is \(1\text{,}\) and thus the nullity of \(A\) is \(1\text{.}\)
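This spanning vector can be double-checked with SymPy's nullspace() method (an outside tool, shown only as a sketch), which returns a list of column vectors spanning the null space:

    from sympy import Matrix

    A = Matrix([[1, 2,  3],
                [2, 3, -1]])

    basis = A.nullspace()    # [Matrix([[11], [-7], [1]])]
    print(basis)
    print(len(basis))        # 1, the nullity of A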

Remark 2.7.16.

To find the null space of a matrix \(A\text{,}\) find the reduced row echelon form of the matrix and solve the corresponding homogeneous system. The solution set is the null space.
A basis of the homogeneous solution set is a basis of the null space.
The next example is a larger one.

Example 2.7.17.

Find the null space of
\begin{equation*} A = \begin{bmatrix} 1 \amp 0 \amp 2 \amp 0 \\ 0 \amp 1 \amp 1 \amp 2 \\ 2 \amp 1 \amp 5 \amp 2 \\ 0 \amp -2 \amp -2 \amp -4 \end{bmatrix} \end{equation*}
Solution.
Again, we solve the homogeneous linear system \(A \vec{x} = \vec{0}\) by finding the reduced row echelon form of \(A\)
\begin{align*} -2R_1 + R_3 \rightarrow R_3 \amp \qquad \begin{bmatrix} 1 \amp 0 \amp 2 \amp 0 \\ 0 \amp 1 \amp 1 \amp 2 \\ 0 \amp 1 \amp 1 \amp 2 \\ 0 \amp -2 \amp -2 \amp -4 \end{bmatrix} \\ \begin{array}{r} -R_2 + R_3 \rightarrow R_3, \\ 2R_2 + R_4 \rightarrow R_4, \end{array} \amp \qquad \begin{bmatrix} 1 \amp 0 \amp 2 \amp 0 \\ 0 \amp 1 \amp 1 \amp 2 \\ 0 \amp 0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0 \\ \end{bmatrix} \end{align*}
and the corresponding equations are:
\begin{align*} x_1 + 2 x_3 \amp = 0 \\ x_2 + x_3 + 2x_4 \amp = 0 \end{align*}
which has the solution set:
\begin{equation*} \{ \begin{bmatrix} -2 \\ -1 \\ 1 \\ 0 \end{bmatrix} t + \begin{bmatrix} 0 \\ -2 \\ 0 \\ 1 \end{bmatrix} s \; | \; t, s \in \mathbb{R} \} \end{equation*}
and this is the null space of the matrix \(A\text{.}\) Since there are two linearly independent vectors that span the space, the dimension of the null space of the matrix is 2. Thus the nullity of \(A\) is 2.
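The same check works for this larger matrix; SymPy returns two spanning vectors, matching the nullity of 2 (again, an illustration rather than part of the text):

    from sympy import Matrix

    A = Matrix([[1,  0,  2,  0],
                [0,  1,  1,  2],
                [2,  1,  5,  2],
                [0, -2, -2, -4]])

    basis = A.nullspace()
    for v in basis:
        print(v.T)           # print each basis vector as a row for readability
    print(len(basis))        # 2, the nullity of A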

Subsection 2.7.4 Relationship between the Rank and Nullity

The relationship between the rank and the nullity of an \(m \times n\) matrix \(A\) is summed up in the following theorem.

Theorem 2.7.18. Rank-Nullity Theorem.

If \(A\) is an \(m \times n\) matrix, then
\begin{equation*} \text{rank}(A) + \text{nullity}(A) = n\text{,} \end{equation*}
that is, the rank plus the nullity equals the number of columns of \(A\text{.}\)

Example 2.7.19.

Show that the Rank-Nullity Theorem holds for Example 2.7.15 and Example 2.7.17.
Solution.
In Example 2.7.15, the \(2 \times 3\) matrix is reduced to reduced row echelon form, which has 2 non-zero rows, hence the rank of the matrix is 2. The nullity was shown to be 1, and the sum is 3, the number of columns of \(A\text{.}\)
In Example 2.7.17, the \(4 \times 4\) matrix is reduced to reduced row echelon form, which again has 2 non-zero rows, so the rank of the matrix is 2. The example also shows that the nullity is 2, and the sum is 4, the number of columns of \(A\text{.}\)
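Both checks can be bundled into a short SymPy sketch (illustrative only): for each matrix, the rank plus the nullity equals the number of columns.

    from sympy import Matrix

    matrices = [
        Matrix([[1, 2,  3],
                [2, 3, -1]]),                 # matrix from Example 2.7.15
        Matrix([[1,  0,  2,  0],
                [0,  1,  1,  2],
                [2,  1,  5,  2],
                [0, -2, -2, -4]]),            # matrix from Example 2.7.17
    ]

    for M in matrices:
        rank    = M.rank()
        nullity = len(M.nullspace())
        # rank + nullity should equal the number of columns of M
        print(rank, nullity, rank + nullity == M.cols)   # True in both cases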