Description of the UCI protocol: https://ucichessengine.wordpress.com/2011/03/16/description-of-uci-protocol/
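For reference, a typical `info` line in such a log looks like this (the field values here are purely illustrative, shaped after the UCI protocol description linked above):

```
info depth 20 seldepth 28 multipv 1 score cp 35 nodes 1500000 nps 1200000 hashfull 999 tbhits 0 time 1250 pv e2e4 e7e5 g1f3
```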
Let us parse the logs first:
In [1]:
    
%pylab inline
    
    
In [2]:
    
! grep "multipv 1" log2.txt  | grep -v lowerbound | grep -v upperbound > log2_g.txt
    
In [3]:
    
def parse_info(l):
    """Parse one UCI 'info' line into a dict of its fields."""
    D = {}
    k = l.split()
    i = 0
    assert k[i] == "info"
    i += 1
    while i < len(k):
        if k[i] in ("depth", "seldepth", "multipv", "nodes", "nps", "tbhits"):
            D[k[i]] = int(k[i+1])
            i += 2
        elif k[i] == "score":
            if k[i+1] == "cp":
                D["score_p"] = int(k[i+2]) / 100. # score in pawns
            i += 3 # also skips "score mate N", which has no centipawn value
        elif k[i] == "hashfull":
            D[k[i]] = int(k[i+1]) / 1000. # between 0 and 1
            i += 2
        elif k[i] == "time":
            D[k[i]] = int(k[i+1]) / 1000. # elapsed time in [s]
            i += 2
        elif k[i] == "pv":
            D[k[i]] = k[i+1:] # the rest of the line is the principal variation
            return D
        else:
            raise Exception("Unknown keyword: %s" % k[i])
    return D
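To make the parsing concrete, here is a condensed sketch of the same logic applied to a single illustrative line (the field values are invented for the example, not taken from the log):

```python
# An illustrative UCI "info" line (values invented for this example).
line = ("info depth 20 seldepth 28 multipv 1 score cp 35 nodes 1500000 "
        "nps 1200000 hashfull 999 tbhits 0 time 1250 pv e2e4 e7e5 g1f3")
k = line.split()
ipv = k.index("pv")  # everything after "pv" is the principal variation

D = {}
i = 1  # skip the leading "info" token
while i < ipv:
    key = k[i]
    if key == "score":                     # "score cp <centipawns>"
        D["score_p"] = int(k[i+2]) / 100.0 # centipawns -> pawns
        i += 3
    elif key == "hashfull":
        D[key] = int(k[i+1]) / 1000.0      # permille -> fraction in [0, 1]
        i += 2
    elif key == "time":
        D[key] = int(k[i+1]) / 1000.0      # milliseconds -> seconds
        i += 2
    else:                                  # plain integer fields
        D[key] = int(k[i+1])
        i += 2
D["pv"] = k[ipv+1:]

print(D["score_p"], D["time"], D["pv"][0])  # 0.35 1.25 e2e4
```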
    
In [4]:
    
# Parse each line of the filtered log into a dict
D = []
for l in open("log2_g.txt").readlines():
    D.append(parse_info(l))
# Transpose the list of dicts into a dict of arrays (missing keys become -1)
data = {}
for key in D[-1].keys():
    d = []
    for x in D:
        if key in x:
            d.append(x[key])
        else:
            d.append(-1)
    if key != "pv":
        d = array(d)
    data[key] = d
    
The number of nodes searched depends linearly on time:
In [5]:
    
title("Number of nodes searched in time")
plot(data["time"] / 60., data["nodes"], "o")
xlabel("Time [min]")
ylabel("Nodes")
grid()
show()
    
    
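One way to quantify this linearity is a least-squares line fit with `numpy.polyfit`; the slope is then the average nodes-per-second rate. Since the log file itself is not reproduced here, the sketch below uses synthetic stand-ins for `data["time"]` and `data["nodes"]` (an assumed constant rate of ~1.2e6 nodes/s with a little noise):

```python
import numpy as np

# Synthetic stand-ins for data["time"] and data["nodes"]: a constant-rate
# search of ~1.2e6 nodes/s with 2% multiplicative noise.
rng = np.random.default_rng(0)
t = np.linspace(1.0, 600.0, 50)          # elapsed time [s]
nodes = 1.2e6 * t * (1 + 0.02 * rng.standard_normal(t.size))

slope, intercept = np.polyfit(t, nodes, 1)  # least-squares line fit
print("nodes/s from fit: %.3g" % slope)     # close to 1.2e6 if growth is linear
```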
So nodes per second is roughly constant:
In [6]:
    
title("Positions per second in time")
plot(data["time"] / 60., data["nps"], "o")
xlabel("Time [min]")
ylabel("Positions / s")
grid()
show()
    
    
The hashtable quickly fills up and then stays at full capacity:
In [7]:
    
title("Hashtable usage")
hashfull = data["hashfull"]
hashfull[hashfull == -1] = 0
plot(data["time"] / 60., hashfull * 100, "o")
xlabel("Time [min]")
ylabel("Hashtable filled [%]")
grid()
show()
    
    
The number of nodes needed to reach a given depth grows exponentially, except for forced moves, which require very few nodes to search (these show up as horizontal plateaus):
In [8]:
    
title("Number of nodes vs. depth")
semilogy(data["depth"], data["nodes"], "o")
xlabel("Depth [half moves]")
ylabel("Nodes")
grid()
show()
    
    
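The exponential growth can be summarized by the effective branching factor: the ratio of nodes searched at successive depths. The sketch below uses synthetic stand-ins for `data["depth"]` and `data["nodes"]` (an assumed clean doubling per half move; the real log would give noisy ratios, with values near 1 on the forced-move plateaus):

```python
import numpy as np

# Synthetic stand-ins for data["depth"] and data["nodes"]:
# node counts that exactly double with each additional half move.
depth = np.arange(1, 21)
nodes = 1000.0 * 2.0 ** depth

# Effective branching factor between consecutive depths
ebf = nodes[1:] / nodes[:-1]
print("mean effective branching factor:", ebf.mean())  # 2.0 for this data
```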
In [9]:
    
title("Score")
plot(data["depth"], data["score_p"], "o")
xlabel("Depth [half moves]")
ylabel("Score [pawns]")
grid()
show()
    
    
Convergence of the variations:
In [10]:
    
for i in range(len(data["depth"])):
    # Print the first 100 characters of the principal variation at each depth
    print("%2i %s" % (data["depth"][i], " ".join(data["pv"][i])[:100]))