You do not need to run the next command when using the IPython notebook on SageMathCloud, but it may be needed if you open this notebook on another system where the defaults are set differently. It does "from pylab import *" so that numpy and plotting commands are available, and sets things up so that plots appear inline in the notebook webpage.
In [128]:
%pylab inline
This is written in a more general form than the code used in the lectures. The function fvals_sqrt2(x) returns the value of $f(x) = x^2 - 2$ and the value of the derivative $f'(x) = 2x$ for any $x$ (or, for a numpy array of x values, it returns the component-wise arrays; this is used below when evaluating at many points for plotting purposes).
In [120]:
def fvals_sqrt2(x):
    f = x**2 - 2.
    fprime = 2.*x
    return f, fprime
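For example (a quick check), it can be called with a single value or with an array of values:
In [ ]:
print fvals_sqrt2(2.)                   # scalar input: prints (2.0, 4.0)
print fvals_sqrt2(linspace(0., 2., 5))  # array input: prints component-wise arrays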
Here's the code for doing the general Newton iteration, starting from some initial guess x0 and iterating until f(xk) is sufficiently close to 0, as defined by the parameter tol.
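In formulas, each step uses the update $x_{k+1} = x_k - f(x_k)/f'(x_k)$, and the loop stops as soon as $|f(x_k)| <$ tol (or after kmax iterations, whichever comes first).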
In [121]:
def newton(fvals, x0, tol):
    xk = x0
    kmax = 30   # maximum number of iterations
    print "   k           xk                     f(xk)"
    for k in range(kmax):
        fxk, fpxk = fvals(xk)   # evaluate f(x) and f'(x)
        print "%4i %22.15f %22.15f" % (k, xk, fxk)
        if abs(fxk) < tol:
            break   # leave the loop if tolerance is satisfied
        xk = xk - fxk / fpxk    # update xk using Newton's method
    return xk
Try it out...
In [122]:
x_newton = newton(fvals_sqrt2, 5., 1e-5)
print "The true value is %22.15f" % sqrt(2.)
print "The absolute error is %22.15e" % (abs(x_newton) - sqrt(2.))
This next function takes as input some value xk (representing $x_k$ in an iteration of Newton's method) and returns $x_{k+1}$. It also plots the function and the tangent line at xk that is used to define the next iterate, over an interval containing both points, with the x-axis drawn in and a legend giving the two values.
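The tangent line at $x_k$ is $y = f(x_k) + f'(x_k)(x - x_k)$; setting $y = 0$ and solving for $x$ gives $x_{k+1} = x_k - f(x_k)/f'(x_k)$, the green point where the red tangent line crosses the x-axis in the plots below.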
In [123]:
def newton_step_plot(xk, fvals):
    fxk, fpxk = fvals(xk)
    xkp1 = xk - fxk / fpxk            # calculate x_{k+1} from x_k
    xdiff = abs(xkp1 - xk)
    xlower = min(xk, xkp1) - xdiff
    xupper = max(xk, xkp1) + xdiff
    x = linspace(xlower, xupper, 500)
    fx, fpx = fvals(x)
    tangent_line = fxk + (x - xk)*fpxk
    plot(x, fx, 'b')
    plot(x, tangent_line, 'r')
    plot([xk], [fxk], 'ro', label="x_k = %g" % xk)
    plot([xkp1], [0.], 'go', label="x_{k+1} = %g" % xkp1)
    plot([xlower, xupper], [0, 0], 'k')   # x-axis
    legend(loc='best')   # uses the labels specified in the plot commands above
    axis('tight')
    return xkp1
Try this out; you can change the initial guess and/or the number of iterations taken to see how it behaves.
In [124]:
xk = 5.   # initial guess
for i in range(5):
    figure()   # so each plot is a separate figure rather than on top of each other
    xk = newton_step_plot(xk, fvals_sqrt2)
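Since this loop leaves the last iterate in xk, you can check how close it is to $\sqrt{2}$ (a minimal check, reusing the variables defined above):
In [ ]:
print "After these steps, xk = %22.15f" % xk
print "Error relative to sqrt(2): %10.3e" % abs(xk - sqrt(2.))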
In [125]:
def fvals(x):
    f = x**3 - 6*x**2 + 11*x - 5
    fprime = 3*x**2 - 12*x + 11
    return f, fprime

# Plot the function over a suitable range to get an idea of where the zeros are:
x = linspace(-2, 5, 1000)
fx, fpx = fvals(x)
plot(x, fx)
plot(x, zeros(len(x)), 'k')   # x-axis
ylim(-5, 5)
Out[125]:
[plot of f(x) over -2 ≤ x ≤ 5 with the x-axis drawn in, showing where f crosses zero]
Based on the plot we can choose an initial guess and take a few iterations:
In [126]:
xk = 1.
for i in range(5):
    figure()   # so each plot is a separate figure rather than on top of each other
    xk = newton_step_plot(xk, fvals)
Check that we found an approximate root:
In [127]:
fxk, fpxk = fvals(xk)
print "At xk = %22.15e, f(xk) = %22.15e " % (xk,fxk)
You might try other initial guesses in the empty cell below to see how the iteration behaves:
In [ ]:
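For example (an illustrative sketch; the starting point 1.5, near a critical point of $f$, is just one possible choice), an initial guess where $f'(x)$ is small makes the tangent line nearly horizontal and sends the next iterate far away:
In [ ]:
xk = 1.5   # f'(1.5) = -0.25, so the tangent line is nearly horizontal
for i in range(3):
    figure()
    xk = newton_step_plot(xk, fvals)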