Linear-time

Pseudo-Code

The "Linear-time" maximum sub-array algorithm is described by the following pseudo-code:

LINEAR-TIME-MAX-SUBARRAY(A[1, ..., N])
    maximum sum = -Infinity
    ending here sum = -Infinity
    for i from 1 to N
        ending here high index = i
        if ending here sum > 0
            ending here sum = ending here sum + A[i]
        else
            ending here low index = i
            ending here sum = A[i]

        if ending here sum > maximum sum
            maximum sum = ending here sum
            start index = ending here low index
            end index = ending here high index

    return maximum sum, A[start index, ..., end index]
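
As a concrete reference, below is a minimal Python sketch of the pseudo-code above. The function name, 0-based indexing, and the returned tuple are my own choices, not part of the original pseudo-code.

def linear_time_max_subarray(A):
    """Return (maximum sum, maximum subarray) for a non-empty list A.

    Follows the pseudo-code above, adapted to Python's 0-based indexing.
    """
    maximum_sum = float("-inf")
    ending_here_sum = float("-inf")
    ending_here_low = 0
    start_index = end_index = 0

    for i, value in enumerate(A):
        ending_here_high = i
        if ending_here_sum > 0:
            # Extending the best subarray ending at the previous element still helps.
            ending_here_sum += value
        else:
            # A non-positive running sum can only hurt; restart the subarray at i.
            ending_here_low = i
            ending_here_sum = value

        if ending_here_sum > maximum_sum:
            maximum_sum = ending_here_sum
            start_index = ending_here_low
            end_index = ending_here_high

    return maximum_sum, A[start_index:end_index + 1]

For example, linear_time_max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]) returns (6, [4, -1, 2, 1]).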

Theoretical Run-time Analysis

The $i$ loop runs from $1$ to $N$, and the loop body performs only constant-time operations. We can therefore compute the total work as:

\begin{equation} \begin{split} \sum_{i=1}^N \Theta(1) & = N\cdot \Theta(1) \\ & = \Theta(N) \end{split} \end{equation}

Thus, the theoretical run-time is $\Theta(N)$.

Experimental Analysis

For a series of array sizes $N$, 10 random arrays were generated and run through the "Linear-time" algorithm. The CPU clock time was recorded for each of the 10 random array inputs, and an average run time was computed. Below is the plot of average run time versus $N$ for the "Linear-time" algorithm.
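
The exact benchmarking script is not reproduced here, but a minimal sketch of the procedure described above (10 random arrays per size, CPU clock time averaged over the trials) might look like the following. The value range, array sizes, and timer choice are illustrative assumptions, and the sketch reuses linear_time_max_subarray from the earlier sketch.

import random
import time

def average_runtime(n, trials=10):
    """Average CPU time (seconds) over `trials` random arrays of size n."""
    total = 0.0
    for _ in range(trials):
        # Value range is an illustrative assumption, not taken from the report.
        A = [random.randint(-100, 100) for _ in range(n)]
        start = time.process_time()          # CPU clock time, not wall-clock time
        linear_time_max_subarray(A)
        total += time.process_time() - start
    return total / trials

# Collect (N, average run time) pairs for plotting; sizes are illustrative.
sizes = [10_000, 100_000, 1_000_000]
results = [(n, average_runtime(n)) for n in sizes]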

A curve fit was applied to the average run time data, yielding the following fit equation, where $x$ stands in for $N$ and $y$ is the run time in seconds:

$$y = 2.04735 \times 10^{-7} \, x - 1.4449$$

The best-fit line matches the plotted data closely, indicating that the run times grow linearly. This observed linear trend agrees with the theoretically derived run time of $\Theta(N)$.

Based on the average run time data curve fit, we would expect the "Linear-time" algorithm to be able to process the following number of elements in the given amount of time:

Time          Max Input Size (elements)
10 seconds    55,901,100
30 seconds    153,588,000
60 seconds    300,119,000
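
The entries above follow from inverting the fit equation, i.e. solving $x = (y + 1.4449) / (2.04735 \times 10^{-7})$ for each target time $y$. A quick Python check of that arithmetic (the helper name is my own):

def max_input_size(seconds, slope=2.04735e-7, intercept=-1.4449):
    """Invert the linear fit y = slope * x + intercept to estimate the largest N solvable in `seconds`."""
    return (seconds - intercept) / slope

for t in (10, 30, 60):
    print(t, "seconds ->", round(max_input_size(t)))
# Approximately 55.9 million, 153.6 million, and 300.1 million elements.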