Newton's method uses a clever insight to iteratively home in on the root of a function $f$. The central idea is to approximate $f$ by its tangent at some initial position $x_0$:
$$ y = f'(x_0) (x-x_0) + f(x_0) $$

The $x$-intercept of this line is then closer to the root than the starting position $x_0$. That is, we need to solve the linear relation
$$ f'(x_0)(x_1-x_0) + f(x_0) = 0 $$

for the updated position $x_1 = x_0 - f(x_0)/f'(x_0)$. Repeating this sequence
$$ x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} $$

will yield a fixed point, which is the root of $f$ if one exists in the vicinity of $x_0$.
In [2]:
def newtons_method(f, df, x0, tol=1E-6):
    x_n = x0
    # iterate until the residual |f(x_n)| drops below the tolerance
    while abs(f(x_n)) > tol:
        x_n = x_n - f(x_n)/df(x_n)
    return x_n
In [10]:
# define a test function with roots at x = 0 and x = 6
def f(x):
    return (x-3)**2 - 9

# first derivative
def df(x):
    return 2*(x-3)

# second derivative
def df2(x):
    return 2.
In [14]:
root = newtons_method(f, df, x0=0.1)
print("root {0}, f(root) = {1}".format(root, f(root)))
In [16]:
minimum = newtons_method(df, df2, x0=0.1)
print("minimum {0}, f'(minimum) = {1}".format(minimum, df(minimum)))
There is an important qualifier in the statement about fixed points: a root needs to exist in the vicinity of $x_0$! Let's see what happens if that's not the case:
In [17]:
def f(x):
    return (x-3)**2 + 1  # minimum value is 1, so f has no real root

# note: df from above is still the correct derivative,
# since the new f differs only by a constant
newtons_method(f, df, x0=0.1)
With a little more defensive programming we can make sure that the function will terminate after a given number of iterations:
In [21]:
def newtons_method2(f, df, x0, tol=1E-6, maxiter=100000):
    x_n = x0
    for _ in range(maxiter):
        x_n = x_n - f(x_n)/df(x_n)
        if abs(f(x_n)) < tol:
            return x_n
    raise RuntimeError("Failed to find a root within {} iterations".format(maxiter))
In [22]:
newtons_method2(f, df, x0=0.1)
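Since this parabola never crosses zero, the call above ends in a `RuntimeError` instead of looping forever. In practice you would catch that exception; here is a minimal sketch of that pattern (the `newtons_method2` definition is repeated so the snippet runs on its own):

```python
def newtons_method2(f, df, x0, tol=1E-6, maxiter=100000):
    x_n = x0
    for _ in range(maxiter):
        x_n = x_n - f(x_n)/df(x_n)
        if abs(f(x_n)) < tol:
            return x_n
    raise RuntimeError("Failed to find a root within {} iterations".format(maxiter))

# a parabola with minimum value 1: no real root, so Newton's method cannot converge
def g(x):
    return (x-3)**2 + 1

def dg(x):
    return 2*(x-3)

try:
    newtons_method2(g, dg, x0=0.1, maxiter=1000)
except RuntimeError as err:
    print("no root found:", err)

# the safeguarded routine behaves exactly like the original when a root exists
root = newtons_method2(lambda x: (x-3)**2 - 9, lambda x: 2*(x-3), x0=0.1)
print("root:", root)
```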