
Section 3.4 Newton’s Method

The most ubiquitous root-finding method is Newton’s Method, which we saw in Chapter 1 for finding \(\sqrt{65}\) by finding the root of \(f(x)=x^{2}-65\text{.}\)
Recall that to find \(\sqrt{a}\text{,}\) we can solve \(f(x)=x^{2}-a=0\) using the method:
\begin{equation*} x_{n+1} = \frac{1}{2}\biggl( x_{n} + \frac{a}{x_{n}}\biggr) \end{equation*}
which converges to \(\sqrt{a}\) quadratically for any starting value \(x_{0} \gt 0\text{.}\) This method is Newton’s method for finding the root of \(f(x)=x^{2}-65\text{.}\)
In general, Newton’s Method for solving \(f(x)=0\) can be understood graphically. First, consider a starting point \(x_{0}\) for the method, and plot the function and its tangent line at \(x_{0}\text{:}\)
Figure 3.4.1.
The iteration proceeds with \(x_{1}\) being the root of the tangent line. Since the tangent line to \(y=f(x)\) at \(x=a\) is
\begin{equation*} y - f(a) = f'(a) (x-a), \end{equation*}
we let \(a=x_{0}\text{,}\) \(x=x_{1}\text{,}\) and \(y=0\text{,}\) and solve for \(x_{1}\) to get:
\begin{equation*} x_{1} = x_{0} - \frac{f(x_{0})}{f'(x_{0})}, \end{equation*}
or, for a general step \(n\text{,}\)
\begin{equation} x_{n+1} = x_{n} - \frac{f(x_{n})}{f'(x_{n})}.\tag{3.4.1} \end{equation}
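For readers who want to experiment, iteration (3.4.1) can be sketched in a few lines of Python; the helper name, tolerance, and iteration cap below are our own choices, not prescribed by the text.

```python
# A minimal Python sketch of iteration (3.4.1); the helper name and
# stopping rule are our own choices, not from the text.
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until |f(x_n)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - fx / fprime(x)
    return x

# Reproduce the Chapter 1 example: the root of f(x) = x**2 - 65 is sqrt(65).
root = newton(lambda x: x**2 - 65, lambda x: 2 * x, x0=8.0)
```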
Note that this fits the fixed point iteration scheme with
\begin{equation*} g(x) = x - \frac{f(x)}{f'(x)} \end{equation*}
so all of the analysis performed in Section 3.3 applies to Newton’s Method with this definition of \(g\text{.}\)

Example 3.4.3.

Use Newton’s method to find the intersection of \(y=\cos x\) and \(y=x\) in the first quadrant.
Solution.
This means that we find the root of \(f(x)=\cos x-x\text{,}\) for which \(f'(x) = -\sin x - 1\text{.}\) Newton’s method for this problem is:
\begin{equation*} \begin{aligned}x_{n+1} \amp = x_{n} - \frac{\cos x_{n} -x_{n}}{-\sin x_{n}-1}\end{aligned} \end{equation*}
If we iterate this method with \(x_{0}=1\) we get the results:
Table 3.4.4. Table of values in Newton’s method for \(f(x)=\cos x - x\)
\(x_{i}\) \(f(x_{i})\)
\(1\) \(-0.45969769413186023\)
\(0.7503638678402439\) \(-0.018923073822117442\)
\(0.7391128909113617\) \(-0.00004645589899088254\)
\(0.7390851333852839\) \(-2.847204694234051\times 10^{-10}\)
\(0.7390851332151606\) \(1.1102230246251565\times 10^{-16}\)
It appears that the sequence \(\{x_{n}\}\) is converging, and the function values at those points are approaching zero. We will analyze the convergence and rate of convergence later in this section.
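The iteration for this example can also be run with a short Python loop (a sketch of ours, not part of the text), which reproduces the values in Table 3.4.4:

```python
import math

# A short loop reproducing Table 3.4.4: Newton's method for
# f(x) = cos(x) - x starting from x0 = 1.
f = lambda x: math.cos(x) - x
fp = lambda x: -math.sin(x) - 1

x = 1.0
for n in range(5):
    print(x, f(x))       # the columns x_i and f(x_i) of the table
    x = x - f(x) / fp(x)
```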

Subsection 3.4.1 Performing Newton’s Method using a Spreadsheet

Newton’s method is relatively easy to calculate with a hand calculator if you know the derivative. You can use a spreadsheet to remove much of the tedium of the repeated calculations, as the following example shows. We will reproduce the results in Example 3.4.3.
First, start with the following spreadsheet:
A B C
1 \(x\) \(f(x)\) \(f'(x)\)
2 1 -0.459697694 -1.84147
where
  • cell A2 is the number 1,
  • cell B2 is the formula =cos(a2)-a2,
  • cell C2 is the formula =-sin(a2)-1, the derivative of the function in cell B2.
Next, in cell A3, enter the formula for Newton’s method, =A2-B2/C2. In cells B3 and C3, copy (or drag) down from cells B2 and C2. You should see:
Table 3.4.5. Spreadsheet for Newton’s method for \(f(x)=\cos x - x\)
A B C
1 \(x\) \(f(x)\) \(f'(x)\)
2 1 -0.459697694 -1.84147
3 0.7503639 -0.0189231 -1.6819050
and if the last row is dragged down a few rows, you will get the table in Example 3.4.3.

Subsection 3.4.2 Analysis of Newton’s Method

Recall that Newton’s Method fits the iteration form of Section 3.3 with
\begin{equation*} g(x) = x - \frac{f(x)}{f'(x)} \end{equation*}
Using this, we can show that Newton’s method is a quadratically converging method if \(f(a)=0\) and \(f'(a) \neq 0\text{.}\) We will use Theorem 3.3.11 to show that it converges quadratically, that is, that \(g(a)=a\text{,}\) \(g'(a)=0\) and \(g''(a) \neq 0\text{:}\)
\begin{equation*} \begin{aligned}g(a) \amp = a - \frac{f(a)}{f'(a)}= a, \\ g'(x) \amp = 1 - \frac{(f'(x))^{2} - f(x) f''(x)}{(f'(x))^{2}}=\frac{f(x)f''(x)}{(f'(x))^{2}}, \\ g'(a) \amp = \frac{f(a)f''(a)}{(f'(a))^{2}}= 0, \\ g''(x) \amp = \frac{(f'(x))^{2}[f'(x)f''(x)+f(x)f'''(x)] - 2 f(x) f'(x)(f''(x))^{2}}{(f'(x))^{4}},\\ g''(a) \amp = \frac{(f'(a))^{3}f''(a)}{(f'(a))^{4}}= \frac{f''(a)}{f'(a)}\end{aligned} \end{equation*}
and if \(f''(a) \neq 0\text{,}\) then \(g(a)=a, g'(a)=0, g''(a) \neq 0\text{.}\)
Lastly, the asymptotic error constant can be found using Theorem 3.3.11 as
\begin{equation} \lambda = \frac{g''(a)}{2}= \frac{f''(a)}{2f'(a)}\tag{3.4.2} \end{equation}
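The constant (3.4.2) can be checked numerically. The following Python sketch (our own, with the root value hard-coded) computes the ratios \(|e_{n+1}|/|e_{n}|^{2}\) for \(f(x)=\cos x - x\) and compares them with \(\lambda\text{:}\)

```python
import math

# A numerical check of the asymptotic error constant (3.4.2) for
# f(x) = cos(x) - x; the root value below is hard-coded.
f = lambda x: math.cos(x) - x
fp = lambda x: -math.sin(x) - 1

a = 0.7390851332151607            # root of cos(x) = x
lam = -math.cos(a) / (2 * fp(a))  # f''(x) = -cos(x)

x, e_prev = 1.0, 1.0 - a
ratios = []
for _ in range(3):
    x = x - f(x) / fp(x)
    e = x - a
    ratios.append(e / e_prev ** 2)  # should approach lam
    e_prev = e
```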

Example 3.4.6.

Find the formula for Newton’s method for the function \(f(x)=x^{2}-a\text{,}\) for \(a \gt 0\text{.}\) In addition, find the asymptotic error constant.
Solution.
In this case, using Newton’s Method from (3.4.1),
\begin{equation*} \begin{aligned}x_{n+1} \amp = x_{n} - \frac{x_{n}^{2}-a}{2x_{n}}\\ \amp = \frac{2x_{n}^{2} - (x_{n}^{2}-a)}{2x_{n}}\\ \amp = \frac{x_{n}^{2} + a}{2x_{n}}\end{aligned} \end{equation*}
Also, the asymptotic error constant is found by using (3.4.2), evaluated at the root \(\sqrt{a}\text{:}\)
\begin{equation*} \lambda = \frac{f''(\sqrt{a})}{2f'(\sqrt{a})}= \frac{2}{2 \cdot 2\sqrt{a}}= \frac{1}{2 \sqrt{a}} \end{equation*}
which agrees with what we showed in Example 3.3.12.
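As a quick sanity check (a sketch of our own), the iteration derived above can be run in a few lines of Python for \(a=65\text{,}\) reproducing the \(\sqrt{65}\) computation from Chapter 1:

```python
# The square-root iteration derived in Example 3.4.6, applied to a = 65.
a = 65.0
x = 8.0                      # any positive starting value works
for _ in range(6):
    x = (x * x + a) / (2 * x)
```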

Subsection 3.4.3 Newton’s Method for roots of multiplicity greater than 1

We start with an example showing that Newton’s method does not always converge quadratically.

Example 3.4.7.

Apply Newton’s method to the function \(f(x) = e^x-x-1\) starting with \(x_0=0.5\text{.}\)
Solution.
In this case, \(f'(x) = e^x-1\) and therefore Newton’s method is
\begin{equation} x_{n+1} = x_n - \frac{e^{x_n} - x_n -1}{e^{x_n}-1}\tag{3.4.3} \end{equation}
The following table shows the iterates:
\(n\) \(x_n\)
0 0.5
1 0.25480007759114687
2 0.1802096055020658
3 0.14103188077228046
4 0.11641077194721548
5 0.0993613061968627
6 0.08679954162860683
7 0.07713325770653695
8 0.06945102448715239
9 0.0631910576041048
10 0.05798716463299298
It appears that this sequence is converging to 0 (note that \(e^0-0-1=0\)); however, it also appears to be converging very slowly. It can be shown numerically that the convergence is linear.
What is going on here? Why so slow? Notice that in the method in (3.4.3), the denominator is also going to 0, and in fact, if we plug in \(x=0\text{,}\) we get a division by zero.
Even though this is Newton’s method, if we use Theorem 3.3.11 with \(g(x) = x-(e^x-x-1)/(e^x-1)\text{,}\) note first that \(g(0)\) does not exist, however,
\begin{equation*} \lim_{x \to 0} g(x) = 0 \end{equation*}
and
\begin{equation*} g'(x) = -\frac{{\mathrm{e}}^x \,{\left(x-{\mathrm{e}}^x +1\right)}}{{{\left({\mathrm{e}}^x -1\right)}}^2 } \end{equation*}
and again \(g'(0)\) is not defined, but
\begin{equation*} \lim_{x \to 0} g'(x) = \frac{1}{2} \end{equation*}
which shows that this is a linearly convergent method.
The issue can be seen if you recall that the denominator of Newton’s method is the derivative. The function \(f(x) = e^x - x -1\) has \(f'(x) = e^x - 1\text{.}\) If we use Theorem 3.1.7, we get
\begin{equation*} \begin{aligned} f(0) \amp = 0, \amp f'(0) \amp = 0, \amp f''(0) \amp = 1 \neq 0 \end{aligned} \end{equation*}
which shows that \(f(x)\) has a root of multiplicity 2 at \(x=0\text{.}\)
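The linear behavior can be checked directly in Python (a sketch of our own): iterating (3.4.3) from \(x_0=0.5\text{,}\) the ratios \(x_{n+1}/x_n\) approach \(g'(0)=1/2\text{,}\) consistent with the limit computed above.

```python
import math

# A numerical check: iterating (3.4.3) from x0 = 0.5, the error ratios
# x_{n+1}/x_n approach g'(0) = 1/2, so the convergence is linear
# rather than quadratic at this multiplicity-2 root.
f = lambda x: math.exp(x) - x - 1
fp = lambda x: math.exp(x) - 1

x = 0.5
ratios = []
for _ in range(15):
    x_new = x - f(x) / fp(x)
    ratios.append(x_new / x)  # e_n = x_n here, since the root is 0
    x = x_new
```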