This pre-class assignment finishes out the videos from the "Neural Networks Demystified" module. Please watch the videos. Again, you do not have to understand the equations, but the math is included for completeness.
If you are lost, I highly recommend reviewing the entire "Neural Networks Demystified" module, which can be cloned from GitHub:
git clone https://github.com/stephencwelch/Neural-Networks-Demystified
This assignment is due by 11:59 p.m. the day before class and should be uploaded into the appropriate "Pre-class assignments" dropbox folder in the Desire2Learn website.
In [1]:
from IPython.display import YouTubeVideo
YouTubeVideo('5u0jaA3qAGk',width=640,height=360)
Out[1]:
Question 1: In simple terms, what is the "Curse of Dimensionality"?
A brute-force grid search for the best weights takes roughly N^D evaluations, where N is the number of candidate values checked per dimension and D is the number of dimensions (weights). Because the cost grows exponentially with D, it quickly exceeds any reasonable amount of compute time.
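The exponential growth in the answer above can be sketched numerically. This is just an illustration; the value of N here is arbitrary, not taken from the videos:

```python
# Curse of dimensionality: a brute-force grid search costs N**D
# evaluations, where N is the number of candidate values per weight
# and D is the number of weights being searched over.
N = 1000  # hypothetical candidate values checked per weight

for D in (1, 2, 3, 6, 9):
    print(f"D = {D}: {N**D:.0e} evaluations")
```

Even at D = 9 (the number of weights in the example network from the videos), the count reaches 10^27 evaluations, which is far beyond what is computable in practice.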
In [2]:
YouTubeVideo('GlcnxUlrtek',width=640,height=360)
Out[2]:
Question 2: The gradient descent algorithm in Neural Networks is often called "back propagation." What is being passed back through the algorithm and causing the weights to be updated?
The error gradient: each weight's contribution to the overall error is propagated backward through the network. The goal is to find out where the most error came from, then adjust those weights the most.
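As a rough sketch (not the partSix.py implementation), one gradient descent update looks like the following; the weight and gradient values are made up purely for illustration:

```python
import numpy as np

# One gradient descent step: the gradient dJ/dW, produced by
# propagating the error backward, measures each weight's contribution
# to the cost, and the update moves each weight against its gradient.
def gradient_step(W, dJdW, learning_rate=0.1):
    # Weights with larger error contributions get larger corrections.
    return W - learning_rate * dJdW

W = np.array([0.5, -0.3])      # hypothetical current weights
dJdW = np.array([0.2, -0.1])   # hypothetical gradients from backprop
print(gradient_step(W, dJdW))  # nudged weights that lower the cost
```

Repeating this step many times, recomputing the gradients each iteration, is exactly what the trainer class automates later in the assignment.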
Here is a link to the entire code so far:
https://raw.githubusercontent.com/stephencwelch/Neural-Networks-Demystified/master/partSix.py
✅ Do This - Download and inspect the partSix.py file and run the following command:
In [5]:
from partSix import *
print(X)
print(y)
✅ Do This - Create an instance of the Neural Network and apply forward function to estimate $\hat{y}$:
In [6]:
# put your code here
NN = Neural_Network()
NN.forward(X)
Out[6]:
Question 3: How good is this initial estimation?
In [7]:
NN.costFunction(X, y )
Out[7]:
Pretty bad. The cost function reports an error of about 0.241, and the untrained network just predicts values near 0.5 regardless of the input.
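The video series scores the network with a sum-of-squared-errors cost, $J = \frac{1}{2}\sum (y - \hat{y})^2$. A minimal sketch with made-up normalized scores (not the data from partSix.py) shows why constant guesses near 0.5 score badly:

```python
import numpy as np

# Sum-of-squared-errors cost: penalizes the gap between the true
# scores y and the network's estimates yhat.
def cost(y, yhat):
    return 0.5 * np.sum((y - yhat) ** 2)

y = np.array([[0.75], [0.82], [0.93]])  # hypothetical normalized scores
yhat = np.full_like(y, 0.5)             # untrained network's flat guesses
print(cost(y, yhat))                     # large cost for flat guesses
```

Training drives this number toward zero by adjusting the weights along the negative gradient of J.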
In [8]:
from IPython.display import YouTubeVideo
YouTubeVideo('9KM9Td6RVgQ',width=640,height=360)
Out[8]:
✅ Do This - Now, create an instance of the trainer class from the partSix.py file. Call the object's train function, passing it the original X and y data:
In [10]:
#Put your code here
T = trainer(NN)
T.train(X,y)
In [14]:
print(NN.forward(X))
print(NN.costFunction(X, y ))
print(y)
✅ Do This - If done correctly, the NN object should now be trained. Apply the forward function again to see the new estimate of $\hat{y}$.
Question 4: Hopefully this worked and the estimation is better than the previous one. How close are these values to the original grades? What shortcomings are there to testing using this approach?
These values are very close to the original grades. One shortcoming is that gradient descent can get stuck in local minima; another is the small amount of data. Training a network into something genuinely useful, rather than one that just spits back the answers it was trained on, takes a much larger dataset than the three examples used here.
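A common remedy for the testing shortcoming mentioned above is to hold out part of the data, so the network is scored on examples it never saw during training. A minimal sketch with made-up data (the real X and y here have only three rows, too few to split):

```python
import numpy as np

# Hold-out testing: train on one slice of the data, evaluate on the
# rest, so a network that merely memorizes its training rows is
# exposed by poor scores on the unseen test rows.
rng = np.random.default_rng(0)
data = rng.random((10, 2))     # hypothetical inputs (hours sleep/study)
targets = rng.random((10, 1))  # hypothetical normalized test scores

split = 7                      # first 7 rows train, last 3 rows test
train_X, test_X = data[:split], data[split:]
train_y, test_y = targets[:split], targets[split:]
print(train_X.shape, test_X.shape)  # (7, 2) (3, 2)
```

Part 7 of the "Neural Networks Demystified" series covers this overfitting problem in more depth.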
In [15]:
from IPython.display import HTML
HTML(
"""
<iframe
src="https://goo.gl/forms/SIRHykeawcq3Ip753"
width="80%"
height="600px"
frameborder="0"
marginheight="0"
marginwidth="0">
Loading...
</iframe>
"""
)
Out[15]:
Now, you just need to submit this assignment by uploading it to the course Desire2Learn web page for today's dropbox (Don't forget to add your name in the first cell).
© Copyright 2017, Michigan State University Board of Trustees