# 2.12 Newton’s Method

Newton’s method is a numerical technique for solving equations of the form

[2.136]  *f*(*x*) = **0**

where *f* : ℝ^{n} → ℝ^{n} is differentiable. It starts with an initial guess or “seed” value *x*^{[1]}, which the user supplies. Based upon this, the procedure recursively generates a sequence of values *x*^{[2]}, *x*^{[3]}, *x*^{[4]}, … , which should converge to a solution.

###### 2.12.1 Univariate Newton’s Method

In the univariate case, we consider *f* : ℝ → ℝ and seek a value *x* such that

[2.137]  *f*(*x*) = 0

If *f* is nonlinear, this may be difficult to solve directly. However, we can fit a linear polynomial approximation *f̃* to *f* and easily solve

[2.138]  *f̃*(*x*) = 0

The solution approximates a solution to the original equation [2.137]. The quality of the approximation depends upon how we construct *f̃*. Newton’s method repeats this process recursively, at each step improving its approximation of *f* until the solution for [2.138] comes sufficiently close to a solution for [2.137].

Start with seed value *x*^{[1]}. Fit a tangent-line approximation *f̃*^{[1]} to *f* :

[2.139]  *f̃*^{[1]}(*x*) = *f*(*x*^{[1]}) + *f* ′(*x*^{[1]})(*x* − *x*^{[1]})

and select *x*^{[2]} as the solution to

[2.140]  *f̃*^{[1]}(*x*) = 0

This is illustrated in Exhibit 2.16.

Exhibit 2.16: *x*^{[2]} is obtained from *x*^{[1]} by fitting a tangent-line approximation to the function *f*.

Repeat the process at *x*^{[2]} to obtain *f̃*^{[2]} and *x*^{[3]}. Subsequent *f̃*^{[k]} and *x*^{[k + 1]} are obtained recursively in this manner. In general

[2.141]  *f̃*^{[k]}(*x*) = *f*(*x*^{[k]}) + *f* ′(*x*^{[k]})(*x* − *x*^{[k]})

Setting this equal to 0 and solving for *x*^{[k + 1]} yields

[2.142]  *x*^{[k + 1]} = *x*^{[k]} − *f*(*x*^{[k]}) / *f* ′(*x*^{[k]})

This is the general recursive formula for the univariate Newton’s method.
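The recursion [2.142] takes only a few lines to implement. The sketch below is illustrative rather than definitive: the function names, tolerance, and iteration cap are choices of this example, not from the text, and the test equation *x*² − 2 = 0 is a stand-in with known solution √2.

```python
def newton(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 by the recursion of equation [2.142], starting from seed x0."""
    x = x0
    for _ in range(max_iter):
        slope = f_prime(x)
        if slope == 0:
            # Failure mode noted in the caveats: restart from a different seed.
            raise ZeroDivisionError("f'(x) = 0 at an iterate; try another seed value")
        x_next = x - f(x) / slope          # equation [2.142]
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("did not converge within max_iter iterations")

# Illustrative use: solve x**2 - 2 = 0 from seed x[1] = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Note that the user supplies both *f* and its derivative; when a derivative is unavailable, the secant method of Section 2.12.4 applies instead.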

###### 2.12.2 Example: Univariate Newton’s Method

Consider the equation

[2.143]

In this case,

[2.144]

and

[2.145]

Starting with seed value *x*^{[1]} = 1, apply [2.142] recursively. Results are indicated in Exhibit 2.17. Equation [2.143] has solution *x* = 1.19089.

###### 2.12.3 Caveats

The univariate Newton’s method is easy to implement and usually converges rapidly, but it can fail. This can happen if *f* ′(*x*^{[k]}) = 0 for some *k* or if *f* has an inconvenient shape. Such a circumstance is depicted in Exhibit 2.18. These problems can often be resolved by starting at a different seed value *x*^{[1]}.

Equation [2.137] need not have any solutions, so failure of Newton’s method to converge may indicate that no solutions exist. Alternatively, [2.137] may have multiple solutions. For any given seed value, Newton’s method will find only one solution. Starting with other seed values may allow us to find others, but Newton’s method may converge to the same solution found with the first seed value. In this situation, Newton’s method must be supplemented with other analyses to ensure all solutions are found. Suitable analyses will depend upon the form of Equation [2.137].
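The multiple-solution caveat can be illustrated with a hypothetical equation, *x*² − 4 = 0, which has two solutions, *x* = ±2. Sweeping several seed values (the particular seeds chosen here are arbitrary) recovers both, though nothing guarantees a given sweep finds every solution of a general equation.

```python
def newton(f, f_prime, x, tol=1e-10, max_iter=50):
    # Same recursion as [2.142]; returns None on non-convergence.
    for _ in range(max_iter):
        x_new = x - f(x) / f_prime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return None

f = lambda x: x * x - 4            # solutions: x = 2 and x = -2
fp = lambda x: 2 * x

# Each seed yields only one solution; different seeds can yield different ones.
roots = {round(newton(f, fp, seed), 6) for seed in (-3.0, -0.5, 0.5, 3.0)}
```

Here negative seeds converge to −2 and positive seeds to +2, but in general several seeds may collapse to the same solution, which is why supplementary analysis is needed to confirm all solutions are found.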

###### 2.12.4 Secant Method

If an analytic expression for the derivative of *f* is unavailable, it can be approximated using finite differences. Another alternative is the **secant method**, which is a modification of Newton’s method. Using the same general approach as Newton’s method, it replaces tangent lines with secant lines interpolated between consecutive points *x*^{[k − 1]} and *x*^{[k]}, as illustrated in Exhibit 2.19.

In recursive formula [2.142] of Newton’s method, the derivative *f* ′(*x*^{[k]}) is replaced with the secant line slope

[2.146]  ( *f*(*x*^{[k]}) − *f*(*x*^{[k − 1]}) ) / ( *x*^{[k]} − *x*^{[k − 1]} )

Given values *x*^{[k − 1]} and *x*^{[k]}, subsequent value *x*^{[k + 1]} is obtained as

[2.147]  *x*^{[k + 1]} = *x*^{[k]} − *f*(*x*^{[k]}) (*x*^{[k]} − *x*^{[k − 1]}) / ( *f*(*x*^{[k]}) − *f*(*x*^{[k − 1]}) )

The secant method is initialized with two seed values *x*^{[1]} and *x*^{[2]}. The function *f* need not be differentiable. As long as it is continuous and reasonably well behaved, the secant method generally performs well, but convergence issues similar to those with Newton’s method arise.
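A sketch of the secant recursion [2.147], with the same caveats as before: names and tolerances are illustrative, and the test equation *x*² − 2 = 0 is a stand-in. Unlike Newton's method, no derivative is supplied, and two seed values initialize the iteration.

```python
def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 by the secant recursion of equation [2.147]."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            # Zero secant slope: analogous to f'(x) = 0 in Newton's method.
            raise ZeroDivisionError("flat secant line; try different seed values")
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # equation [2.147]
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2                        # slide the interpolation window forward
    raise RuntimeError("did not converge within max_iter iterations")

# Illustrative use: solve x**2 - 2 = 0 from seeds x[1] = 0 and x[2] = 1.
root = secant(lambda x: x * x - 2, 0.0, 1.0)
```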

###### Exercises

Use Newton’s method with seed value *x*^{[1]} = 0 to solve the equation:

[2.148]

Use the secant method with seed values *x*^{[1]} = 0 and *x*^{[2]} = 1 to solve the same equation as in the previous exercise.