Artificial Intelligence for Humans

Introduction to the Math of Neural Networks

Understanding the Summation Operator

You will frequently encounter summations, such as the one shown below:

Equation 1.1: The Summation Operator $$ s = \sum_{i=1}^{10} 2i $$

If you were to write the above equation as code (using NumPy in Python), you would have the following:


In [3]:
import numpy as np

i = np.arange(1,11)  # 11, because arange is not inclusive
s = np.sum(2*i)
print(s)


110

More traditional looping (non-NumPy) would perform the summation as follows:

In [4]:
s = 0
for i in range(1,11):
    s += 2*i
    
print(s)


110

Understanding the Product Operator

Equation 1.2: The Product Operator $$ s = \prod_{i=1}^{5} 2i $$


In [11]:
import numpy as np

i = np.arange(1,6)  # 6, because arange is not inclusive
s = np.prod(2*i)
print(s)


3840

In [12]:
s = 1
for i in range(1,6): # 6, because range's stop value is not inclusive
    s *= 2*i
    
print(s)


3840

Regression and Classification



Transfer Functions

Linear Transfer Function

Equation 1.3: The Linear Transfer Function $$ f(x) = x $$


In [32]:
import numpy as np
import matplotlib.pyplot as plt

def linear_transfer(t):
    return t

x = np.arange(-5.0, 5.0, 0.02)

plt.xlabel("x")
plt.ylabel("y")
plt.plot(x, linear_transfer(x), 'r')
plt.show()


Softmax Transfer Function

Equation 1.4: The Softmax Transfer Function $$ \sigma(\mathbf{z})_j = \frac{e^{z_j}}{\sum_{k=1}^K e^{z_k}} $$
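Unlike the other transfer functions in this chapter, softmax operates on an entire vector of values at once, converting it into values that sum to 1. A minimal NumPy sketch of the equation above (the input vector z is made up for illustration; subtracting the maximum is a common numerical-stability trick, not part of the equation itself):

```python
import numpy as np

def softmax_transfer(z):
    # Subtracting the max does not change the result, but it
    # prevents overflow in np.exp for large inputs.
    e = np.exp(z - np.max(z))
    return e / np.sum(e)

z = np.array([1.0, 2.0, 3.0])  # example input vector
print(softmax_transfer(z))     # components sum to 1
```

Note that larger inputs receive larger probabilities, and the outputs always sum to 1, which is why softmax is commonly used for the output layer of a classification network.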

Rectified Linear Unit (ReLU) Transfer Function

Equation 1.5: The ReLU Transfer Function $$ f(x) = \max(0, x) $$


In [31]:
import numpy as np
import matplotlib.pyplot as plt

def relu_transfer(t):
    return np.maximum(0, t)

x = np.arange(-5.0, 5.0, 0.02)

plt.ylim([-2,3])
plt.xlim([-5,4])
plt.xlabel("x")
plt.ylabel("y")
plt.plot(x, relu_transfer(x), 'r')
plt.show()



In [27]:
x = np.array([1,2,3,4,5])

print(np.maximum(x, 10))  # elementwise maximum of each entry and 10


[10 10 10 10 10]

Sigmoid Transfer Function

Equation 1.6: The Sigmoid Transfer Function $$ f(x) = \frac{1}{1 + e^{-x}} $$


In [33]:
import numpy as np
import matplotlib.pyplot as plt

def sigmoid_transfer(t):
    return 1.0 / (1 + np.exp(-t))

x = np.arange(-5.0, 5.0, 0.02)

plt.xlabel("x")
plt.ylabel("y")
plt.plot(x, sigmoid_transfer(x), 'r')
plt.show()


Hyperbolic Tangent Transfer Function

Equation 1.7: The Hyperbolic Tangent Transfer Function $$ f(x) = \tanh(x) $$

In [34]:
import numpy as np
import matplotlib.pyplot as plt

def tanh_transfer(t):
    return np.tanh(t)

x = np.arange(-5.0, 5.0, 0.02)

plt.xlabel("x")
plt.ylabel("y")
plt.plot(x, tanh_transfer(x), 'r')
plt.show()



Calculating a Neuron

Equation 1.8: Calculate H1 $$ h_1 = A\left(\sum_{c=1}^{n} i_c w_c\right) $$


In [ ]:
import numpy as np

a = np.array( [1, 1, 0] ) # First 1 is the bias, 1 and 0 are the inputs
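A self-contained sketch of the full calculation, using the sigmoid as the activation function A (the weight vector w below is a hypothetical example, chosen only for illustration):

```python
import numpy as np

def sigmoid_transfer(t):
    return 1.0 / (1 + np.exp(-t))

a = np.array([1, 1, 0])         # first 1 is the bias, 1 and 0 are the inputs
w = np.array([0.5, -0.2, 0.3])  # hypothetical weights; the bias weight is first

# Weighted sum of bias and inputs, passed through the activation function A
h1 = sigmoid_transfer(np.sum(a * w))
print(h1)
```

Because the bias is folded into the input vector as a constant 1, the weighted sum and the bias term are handled by a single dot product.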