Estimating the Mean and Variance for our Complexity Measure

Currently, we define our complexity measure $c$ in the following way:

\begin{align} c &= \dfrac{ N_{in} \left( N_{in} - 1 \right) }{ 2 \displaystyle\sum\limits_{i=1}^{N_{in} - 1} \displaystyle\sum\limits_{j = i + 1}^{N_{in}} \vec{q_{i}} \cdot \vec{q_{j}} }. \end{align}
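For reference, this measure can be evaluated directly from a pattern matrix. The sketch below assumes the $\vec{q}_{i}$ are the rows of a binary NumPy array `Q`; the helper name `complexity` is ours, not part of the codebase:

```python
import numpy as np

def complexity(Q):
    """c = N_in (N_in - 1) / (2 * sum over pairs i < j of q_i . q_j),
    where the rows of Q are the N_in output pattern vectors."""
    n_in = Q.shape[0]
    gram = Q @ Q.T                                  # all pairwise dot products
    pairwise = (gram.sum() - np.trace(gram)) / 2.0  # keep only the i < j terms
    return n_in * (n_in - 1) / (2.0 * pairwise)
```

For example, three patterns that pairwise share exactly one active node give `complexity(...) == 1.0`.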

The current estimate for the mean of this measure, based on the random process involved in creating the output pattern matrix, is:

\begin{align} \mathbb{E}\left[ c \right] &= \dfrac{1}{ \displaystyle\sum\limits_{m=1}^{K}{ \dfrac{m}{K} \cdot \dfrac{ \dbinom{N_{out} - K}{m} }{ \dbinom{N_{out}}{K} } } }. \end{align}

Unfortunately, the right-hand side above does not capture the random process appropriately!


In [1]:
import numpy
from meb.utils.mathfuncs import binomial_coefficient
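The `meb` helper is project-specific; assuming it has the usual n-choose-k semantics, the Python standard library offers a drop-in replacement (Python 3.8+):

```python
# Fallback if meb is unavailable: math.comb(n, k) returns the exact
# binomial coefficient for non-negative integers, e.g. comb(8, 4) == 70.
from math import comb as binomial_coefficient
```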

In [4]:
def predict_complexity(n_in, n_out, k):
    """
    Compute the predicted mean complexity for a randomly set up output
    pattern with a certain parameter setting.

    Parameters
    ----------
    n_in: int
        Number of input nodes (unused by this estimate).
    n_out: int
        Number of output nodes.
    k: int
        Number of activated output nodes.

    Returns
    -------
    float
        Predicted mean of the complexity measure c.
    """
    # c = K * binom(N_out, K) / sum_{m=1}^{K} m * binom(N_out - K, m)
    numerator = binomial_coefficient(n_out, k) * float(k)
    denominator = sum(binomial_coefficient(n_out - k, m) * m for m in range(1, k + 1))
    return numerator / denominator

Prediction with $N_{out} = 8$ and $K = 4$:


In [5]:
predict_complexity(8, 8, 4)


Out[5]:
8.75

Prediction with $N_{out} = 8$ and $K = 2$:


In [6]:
predict_complexity(8, 8, 2)


Out[6]:
1.5555555555555554

Prediction with $N_{out} = 8$ and $K = 6$:


In [7]:
predict_complexity(8, 8, 6)


Out[7]:
42.0
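Since the analytical estimate is suspect, a Monte-Carlo check against the random process itself may help. The sketch below *assumes* each output pattern activates a uniformly random $K$-subset of the $N_{out}$ nodes, which may differ from the actual setup in `meb`; `simulate_complexity` is a hypothetical helper:

```python
import numpy as np

def simulate_complexity(n_in, n_out, k, trials=2000, seed=0):
    """Empirical mean and variance of c under the assumed random setup:
    each of the n_in patterns activates k of the n_out nodes uniformly
    at random."""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(trials):
        Q = np.zeros((n_in, n_out))
        for row in Q:
            row[rng.choice(n_out, size=k, replace=False)] = 1.0
        gram = Q @ Q.T
        pairwise = (gram.sum() - np.trace(gram)) / 2.0
        if pairwise > 0:  # skip degenerate draws where no patterns overlap
            samples.append(n_in * (n_in - 1) / (2.0 * pairwise))
    samples = np.asarray(samples)
    return samples.mean(), samples.var()
```

Under this assumed model, the simulated mean for $N_{out} = 8$, $K = 4$ comes out far below the 8.75 predicted above, consistent with the suspicion that the formula does not capture the random process.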

In [14]:
def simple_prediction(n_in, n_out, k):
    """
    Simpler estimate: treat each output node as overlapping in a pair of
    patterns with probability p = (K / N_out)^2, and return the binomial
    mean and variance of the resulting overlap count.
    """
    p = (float(k) / n_out) ** 2
    mean = n_out * p
    var = n_out * p * (1.0 - p)
    return mean, var

In [15]:
simple_prediction(8, 8, 4)


Out[15]:
(2.0, 1.5)