Let $\{p_n \}_{n=0}^{\infty}$ be a sequence such that $L = \lim_{n\rightarrow \infty}p_n $ exists.
We follow the textbook and define the convergence rate, $r$, as the number such that
\begin{equation} \lim_{n \rightarrow \infty} \frac{|p_n - L|}{|p_{n-1}-L|^r} = C \end{equation} for some constant $C > 0$. We will provide some notions of how to recognize the convergence rate from a semi-log plot.
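As a quick sketch (my example, not from the textbook): given three consecutive errors $e_{n-1}, e_n, e_{n+1}$, the defining relation $e_n \approx C\, e_{n-1}^r$ lets us eliminate $C$ and estimate $r$ numerically.

```julia
# Estimate the order of convergence r from three consecutive errors,
# using e_n ≈ C e_{n-1}^r  ⇒  r ≈ log(e_next/e_cur) / log(e_cur/e_prev).
# (Illustrative helper, not from the textbook.)
estimate_rate(e_prev, e_cur, e_next) = log(e_next / e_cur) / log(e_cur / e_prev)

# Example: errors halving each step (linear convergence, k = 1/2)
errors = [0.5^n for n in 1:4]
estimate_rate(errors[1], errors[2], errors[3])   # ≈ 1.0, i.e. linear
```

The same helper applied to a quadratically convergent sequence would return a value close to $2$.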
From Theorem 2.4 in your textbook, the fixed-point method defines a sequence of values $p_n$ such that
\begin{equation} |p_n - L | < \frac{k^n}{1-k} |p_0 - L|, \end{equation} where $k < 1$ is the Lipschitz constant.
We can easily rewrite this as \begin{equation} |p_n - L | \lesssim e^{n \log(k)}, \end{equation} where $\log(k) < 0$ since $k < 1$.
A simple check for linear convergence is to draw a semi-log plot of the error against the iteration number.
In [6]:
using Gadfly

n = 100
iterations = 1:n
# Model a linearly convergent error: |p_n - L| ≈ e^(-n)
errors = exp.(-iterations)
plot(x = iterations, y = errors, Scale.y_log10)
Out[6]: (semi-log plot: the error curve is a straight line with negative slope)
If the error curve is a straight line, then your iterative algorithm has a linear convergence rate.
This can be explained as follows: take $\log$ of both sides of the inequality \begin{equation} |p_n - L | \lesssim e^{n \log(k)}. \end{equation} Given that $\log$ is a monotonically increasing function, we have that \begin{equation} \log(|p_n - L |) \lesssim n \log(k), \end{equation} where $\log(k)$ is a negative constant.
The semi-log plot (logarithmic in $y$) shows the $\log$ of the error against the number of iterations $n$, which in this case results in a straight line with slope equal to $\log(k)$.
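To see this with an actual iteration rather than a synthetic error curve, here is a sketch (my example, not from the textbook) using the fixed-point iteration $p_{n+1} = \cos(p_n)$, which converges linearly to the fixed point of $\cos$:

```julia
# Fixed-point iteration p_{n+1} = cos(p_n): linearly convergent,
# so log|p_n - L| should fall on (roughly) a straight line.
L = 0.7390851332151607          # fixed point of cos (the Dottie number)
p = 1.0
errors = Float64[]
for n in 1:30
    p = cos(p)
    push!(errors, abs(p - L))
end
# Successive error ratios approach |(-sin)(L)| ≈ 0.674, a constant k < 1:
# the signature of linear convergence.
errors[21] / errors[20]
```

Plotting `errors` with `Scale.y_log10`, as above, produces a straight line with slope $\log(k) \approx \log(0.674)$.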
It is known that, under the assumptions on $f$, namely: $f'$ is bounded away from zero close to the root, $f''$ is continuous, and the initial guess $p_0$ is close to the root, Newton's method converges quadratically to its limit.
In other words, \begin{equation} \lim_{n \rightarrow \infty} \frac{|p_n - L|}{|p_{n-1}-L|^2} = C. \end{equation} Iterating the bound $|p_n - L| \leq C |p_{n-1} - L|^2$ gives $|p_n - L| \lesssim q^{2^n}$ with $q = C\,|p_0 - L| < 1$ for $p_0$ close enough to $L$; since $2^n \geq n^2$ for $n \geq 4$, in particular \begin{equation} |p_n - L| \lesssim q^{n^2}. \end{equation}
We can easily see this behavior from a semi-log plot.
In [7]:
using Gadfly

n = 25            # keep n small so e^(-n^2) does not underflow to zero
iterations = 1:n
# Model a quadratically convergent error bound: |p_n - L| ≈ e^(-n^2)
errors = exp.(-iterations.^2)
plot(x = iterations, y = errors, Scale.y_log10)
Out[7]: (semi-log plot: the error curve is an inverted parabola)
If the error curve in a semi-log plot is an inverted parabola (or decays even faster), your method has at least a quadratic convergence rate.
This behavior can be explained in the same way as before, obtaining \begin{equation} \log(|p_n - L |) \lesssim n^2 \log(q), \end{equation} where $\log(q)$ is a negative number.
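A concrete sketch (my example, not from the textbook): Newton's method for $f(x) = x^2 - 2$, whose root is $\sqrt{2}$. The number of correct digits roughly doubles each step, which is the practical signature of quadratic convergence.

```julia
# Newton's method for f(x) = x^2 - 2: quadratic convergence to sqrt(2).
f(x)  = x^2 - 2
fp(x) = 2x                      # derivative of f
p = 2.0                         # initial guess close to the root
for n in 1:5
    p = p - f(p) / fp(p)        # Newton update
    println("n = $n   error = $(abs(p - sqrt(2)))")
end
# The exponent of the error roughly doubles each step
# (≈ 1e-1, 1e-3, 1e-6, 1e-12, ...): quadratic convergence.
```

Plotting these errors on a semi-log scale gives a curve that bends downward at least as fast as the inverted parabola above.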
Please take into account that "convergence rates" are defined depending on the context.
For series, the convergence rate describes how fast the error of the truncated sum decreases. In that setting, what is called a linear rate for an iterative method corresponds to an exponential rate for a truncated series.
As an example, in Homework 1 you showed that approximating $e$ via a Taylor expansion converges factorially fast. Using Stirling's formula, the error behaves like $e^{-n\log{n}}$. If an iterative method had the same error behavior, it would be called superlinear.
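A quick sketch of that observation (my reconstruction, not the homework solution): truncating $e = \sum_{k=0}^{\infty} 1/k!$ at $k = n$ leaves an error of roughly $1/(n+1)!$, which by Stirling's formula is about $e^{-n\log n}$.

```julia
# Partial sums of the Taylor series for e at x = 1: the truncation error
# is ≈ 1/(n+1)!, i.e. roughly e^(-n log n) by Stirling — factorially fast.
s = 0.0
for k in 0:12
    s += 1 / factorial(k)
end
abs(s - exp(1.0))   # ≈ 1.7e-10, on the order of 1/13!
```

After only 13 terms the error is already near $10^{-10}$, far faster than any fixed geometric rate $k^n$.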
For more examples please take a look at https://en.wikipedia.org/wiki/Rate_of_convergence.