This entire Artificial Neural Networks module is from Neural Networks Demystified by @stephencwelch. We have streamlined the content to better fit the format of the class. However, if you have questions or are just curious, I highly recommend downloading everything from the following git repository. It is a great reference to have:
git clone https://github.com/stephencwelch/Neural-Networks-Demystified
This assignment is due by 11:59 p.m. the day before class and should be uploaded into the appropriate "Pre-class assignments" dropbox folder on the Desire2Learn website.
Watch the following video:
In [1]:
from IPython.display import YouTubeVideo
YouTubeVideo('bxe2T-V8XRs',width=640,height=360)
Out[1]:
We will use the data from the video above: $$X = \left[\begin{matrix} 3 & 5 \\ 5 & 1 \\ 10 & 2 \end{matrix}\right] \hspace{1cm} , \hspace{1cm}y = \left[ \begin{matrix} 75 \\ 82 \\ 93 \end{matrix}\right] $$
Create two numpy arrays to store the values of the variables $X$ and $y$, as well as their normalized counterparts $X_{norm}$ and $y_{norm}$.
In [18]:
# your code here (note: import any needed libraries):
import numpy as np
import matplotlib.pyplot as plt
# input data (hours of sleep, hours of study)
X = np.array([[3,5],[5,1],[10,2]])
# normalized X
X_norm = X/np.amax(X)
# output data (test score)
y = np.array([[75],[82],[93]])
# normalized y
y_norm = y/100
Data in a neural network flows via a process called forward propagation. Watch the following video:
In [8]:
from IPython.display import YouTubeVideo
YouTubeVideo('UJwK6jAStmg',width=640,height=360)
Out[8]:
Question 1: How many input layers, hidden layers and output layers are there in the neural network shown in the video?
In [11]:
# Put your answer here
inputLayerSize = 2
outputLayerSize = 1
hiddenLayerSize = 3
In [14]:
# your code here:
W1 = np.random.randn(inputLayerSize, hiddenLayerSize)
W2 = np.random.randn(hiddenLayerSize, outputLayerSize)
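Before multiplying anything, it can help to check that the weight-matrix shapes line up with the layer sizes above. This is a small sketch (the toy `X_norm` values here are just for the shape check, not part of the assignment):

```python
import numpy as np

inputLayerSize, hiddenLayerSize, outputLayerSize = 2, 3, 1

# weights drawn from a standard normal, as in the cell above
W1 = np.random.randn(inputLayerSize, hiddenLayerSize)   # shape (2, 3)
W2 = np.random.randn(hiddenLayerSize, outputLayerSize)  # shape (3, 1)

# with 3 training examples, X_norm has shape (3, 2), so:
X_norm = np.array([[3, 5], [5, 1], [10, 2]]) / 10.0  # toy normalized inputs
Z2 = X_norm.dot(W1)   # (3, 2) @ (2, 3) -> (3, 3): one hidden activity per example
Z3 = Z2.dot(W2)       # (3, 3) @ (3, 1) -> (3, 1): one output per example
print(Z2.shape, Z3.shape)
```

If the inner dimensions did not match, `np.dot` would raise a `ValueError`, which is a quick way to catch a mis-sized weight matrix.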
In [15]:
Z2 = np.dot(X_norm, W1)
Z2
Out[15]:
In [19]:
# your code here:
def sigmoid(z):
    # apply sigmoid activation function
    return 1/(1+np.exp(-z))
Test your sigmoid function using the following testing code:
In [23]:
testInput = np.arange(-6,6,0.01)
plt.plot(testInput, sigmoid(testInput), linewidth=2)
plt.grid(True)
plt.show()
In [24]:
a2 = sigmoid(Z2)
a2
Out[24]:
In [25]:
Z3 = np.dot(a2, W2)
Z3
Out[25]:
In [27]:
# your code here:
yHat = sigmoid(Z3)
In [28]:
y_norm
Out[28]:
In [29]:
yHat
Out[29]:
Of course, the results from forward propagation are terrible; no surprise here! The weights have not been chosen properly. That is what training a network does: the goal is to find a combination of weights so that the result of forward propagation fits the intended output data as closely as possible.
We will be covering this topic in class.
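To make "fits as closely as possible" concrete, training needs a number to minimize. One common choice (a sketch; the class may use a different cost) is the sum-of-squared-errors cost $J = \frac{1}{2}\sum (y - \hat{y})^2$:

```python
import numpy as np

def cost(y, yHat):
    # sum-of-squared-errors cost: smaller means a better fit
    return 0.5 * np.sum((y - yHat)**2)

y_norm = np.array([[0.75], [0.82], [0.93]])
bad_guess  = np.full_like(y_norm, 0.5)  # a poor, constant prediction
good_guess = y_norm + 0.01              # a prediction close to the targets
print(cost(y_norm, bad_guess) > cost(y_norm, good_guess))  # True
```

Training amounts to searching for the weights `W1` and `W2` that drive this cost down, which is the topic covered in class.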
Please go to the following website: http://playground.tensorflow.org/
There, you'll have the opportunity to experiment with an actual neural network (e.g., choosing its architecture and the type of activation function) for classification tasks.
In [30]:
from IPython.display import HTML
HTML(
"""
<iframe
src="https://goo.gl/forms/XqTdYAtXYDSc1R7V2"
width="80%"
height="1200px"
frameborder="0"
marginheight="0"
marginwidth="0">
Loading...
</iframe>
"""
)
Out[30]:
Now, you just need to submit this assignment by uploading it to today's dropbox on the course Desire2Learn web page (don't forget to add your name in the first cell).
© Copyright 2017, Michigan State University Board of Trustees