Nicolas Dronchi

Day 23 Pre-Class assignment: Back propagation

This pre-class assignment finishes the videos from the "Neural Networks Demystified" module. Please watch the videos. Again, you do not have to understand all of the equations; the math is included for completeness.

If you are lost, I highly recommend reviewing the entire "Neural Networks Demystified" module, which can be cloned from GitHub:

git clone https://github.com/stephencwelch/Neural-Networks-Demystified


Goals for this pre-class assignment:


  1. Reviewing gradient descent
  2. Performing back propagation
  3. Training a network

Assignment instructions

This assignment is due by 11:59 p.m. the day before class and should be uploaded into the appropriate "Pre-class assignments" dropbox folder in the Desire2Learn website.


1. Gradient Descent

✅ Do This - watch the following video:


In [1]:
from IPython.display import YouTubeVideo
YouTubeVideo('5u0jaA3qAGk',width=640,height=360)


Out[1]:

Question 1: In simple terms, explain the "Curse of Dimensionality".

A grid search (brute force) over the best values for all weights takes N^D evaluations, where N is the number of candidate values per dimension and D is the number of dimensions. As D grows, this quickly exceeds any reasonable amount of compute time.
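The N^D growth described above can be made concrete with a few lines of Python (a toy illustration; the function name is mine, not from the course code):

```python
# Illustration of the curse of dimensionality: the number of grid points
# an exhaustive search must evaluate grows exponentially in the number
# of dimensions (weights).
def grid_search_cost(n_values_per_dim, n_dims):
    """Total evaluations for a brute-force grid search: N ** D."""
    return n_values_per_dim ** n_dims

# With 1000 candidate values per weight, each added weight multiplies
# the work by another factor of 1000.
for d in (1, 2, 5, 10):
    print(f"{d} dimensions: {grid_search_cost(1000, d):.1e} evaluations")
```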


2. Back Propagation:

Now watch the following video:


In [2]:
YouTubeVideo('GlcnxUlrtek',width=640,height=360)


Out[2]:

Question 2: The gradient descent algorithm in Neural Networks is often called "back propagation." What is being passed back through the algorithm and causing the weights to be updated?

The error is passed backward: each layer receives the gradient of the cost with respect to its weights, which measures how much each weight contributed to the error. The goal is to find where the most error came from and update those weights to reduce it.
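A minimal sketch of that backward pass for a 2-3-1 sigmoid network, following the naming conventions used in the "Neural Networks Demystified" code (W1, W2, z2, a2, z3). This is an illustrative re-implementation, not the partSix.py source:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, used to pass the error signal backward.
    s = sigmoid(z)
    return s * (1.0 - s)

X = np.array([[0.3, 1.0], [0.5, 0.2], [1.0, 0.4]])
y = np.array([[0.75], [0.82], [0.93]])

rng = np.random.default_rng(0)
W1 = rng.standard_normal((2, 3))  # input -> hidden weights
W2 = rng.standard_normal((3, 1))  # hidden -> output weights

# Forward pass
z2 = X @ W1
a2 = sigmoid(z2)
z3 = a2 @ W2
y_hat = sigmoid(z3)

# Backward pass: the output-layer error (delta3) is propagated back
# through W2 to the hidden layer (delta2), giving the gradient of the
# cost with respect to each weight matrix.
delta3 = -(y - y_hat) * sigmoid_prime(z3)
dJdW2 = a2.T @ delta3
delta2 = (delta3 @ W2.T) * sigmoid_prime(z2)
dJdW1 = X.T @ delta2
```

Each gradient matrix has the same shape as the weight matrix it updates, so a gradient-descent step is simply `W -= learning_rate * dJdW`.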

✅ Do This - Download and inspect the partSix.py file and run the following command:


In [5]:
from partSix import *
print(X)
print(y)


[[0.3 1. ]
 [0.5 0.2]
 [1.  0.4]]
[[0.75]
 [0.82]
 [0.93]]

✅ Do This - Create an instance of the Neural Network and apply forward function to estimate $\hat{y}$:


In [6]:
# put your code here
NN = Neural_Network()
NN.forward(X)


Out[6]:
array([[0.52155376],
       [0.41215014],
       [0.41576012]])

Question 3: How good is this initial estimation?


In [7]:
NN.costFunction(X, y)


Out[7]:
array([0.24148593])

Pretty bad. The cost function reports an error of about 0.241, and the untrained network only guesses values near 0.5 regardless of input.
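That 0.241 figure can be reproduced by hand, assuming the squared-error cost J = 0.5 * sum((y - y_hat)^2) used in the "Neural Networks Demystified" module:

```python
import numpy as np

# Squared-error cost, following the module's J = 0.5 * sum((y - y_hat)^2)
# convention (this is a sketch, not the partSix.py implementation).
def cost(y, y_hat):
    return 0.5 * np.sum((y - y_hat) ** 2)

y = np.array([[0.75], [0.82], [0.93]])
y_hat = np.array([[0.52155376], [0.41215014], [0.41576012]])
print(cost(y, y_hat))  # roughly 0.2415, matching the Out[7] value above
```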


3. Training:

Please watch the following video:


In [8]:
from IPython.display import YouTubeVideo
YouTubeVideo('9KM9Td6RVgQ',width=640,height=360)


Out[8]:

✅ Do This - Now, create an instance of the trainer class from the partSix.py file. Call the object's train function by passing it the original X and y data:


In [10]:
#Put your code here
T = trainer(NN)
T.train(X,y)


Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 36
         Function evaluations: 39
         Gradient evaluations: 39
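The "Optimization terminated successfully" banner above comes from SciPy's optimizer, which the module's trainer class wraps. For intuition, the same training can be sketched with plain gradient descent, using the gradients that back propagation provides (an illustrative alternative, not the trainer's actual BFGS method):

```python
import numpy as np

# Plain gradient-descent training of a 2-3-1 sigmoid network.
# W1/W2 and the layer shapes follow the module's conventions, but this
# loop is a sketch, not the partSix.py trainer.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

X = np.array([[0.3, 1.0], [0.5, 0.2], [1.0, 0.4]])
y = np.array([[0.75], [0.82], [0.93]])

rng = np.random.default_rng(1)
W1 = rng.standard_normal((2, 3))
W2 = rng.standard_normal((3, 1))

lr = 1.0  # step size, chosen by hand for this tiny problem
costs = []
for _ in range(2000):
    # Forward pass
    z2 = X @ W1
    a2 = sigmoid(z2)
    z3 = a2 @ W2
    y_hat = sigmoid(z3)
    costs.append(0.5 * np.sum((y - y_hat) ** 2))
    # Backward pass and weight update
    delta3 = -(y - y_hat) * sigmoid_prime(z3)
    delta2 = (delta3 @ W2.T) * sigmoid_prime(z2)
    W2 -= lr * (a2.T @ delta3)
    W1 -= lr * (X.T @ delta2)

print(costs[0], "->", costs[-1])  # the cost should drop substantially
```

BFGS reaches a minimum in far fewer iterations than this fixed-step loop because it approximates second-order curvature information, which is why the trainer above converged in only 36 iterations.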

In [14]:
print(NN.forward(X))
print(NN.costFunction(X, y))
print(y)


[[0.74998259]
 [0.82002105]
 [0.93000224]]
[3.75622733e-10]
[[0.75]
 [0.82]
 [0.93]]

✅ Do This - If done correctly, the NN object should now be trained. Apply the forward function again to see the new estimation of $\hat{y}$.

Question 4: Hopefully this worked and the estimation is better than the previous one. How close are these values to the original grades? What shortcomings are there to testing using this approach?

These values are very close to the original grades. One shortcoming is that we are testing on the same data we trained on, so the network may simply be memorizing the three examples rather than learning anything general. Gradient descent can also get stuck in local minima, and with so little data the trained network is unlikely to be useful on new inputs.
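The standard fix for the testing shortcoming is to hold out data the network never sees during training. A minimal sketch with NumPy only (the data here is synthetic and hypothetical, since the assignment's three samples are far too few to split):

```python
import numpy as np

# Hypothetical dataset standing in for (hours sleep, hours study) -> grade.
rng = np.random.default_rng(42)
X = rng.random((10, 2))
y = rng.random((10, 1))

# Hold out the last 3 rows for testing; train only on the first 7.
X_train, y_train = X[:7], y[:7]
X_test, y_test = X[7:], y[7:]

# A trained model's real score is its cost on X_test/y_test, data it has
# never seen, rather than on training rows it could simply memorize.
print(X_train.shape, X_test.shape)
```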


Assignment wrap-up

Please fill out the form that appears when you run the code below. You must completely fill this out in order to receive credit for the assignment!


In [15]:
from IPython.display import HTML
HTML(
"""
<iframe 
	src="https://goo.gl/forms/SIRHykeawcq3Ip753" 
	width="80%" 
	height="600px" 
	frameborder="0" 
	marginheight="0" 
	marginwidth="0">
	Loading...
</iframe>
"""
)


Out[15]:

Congratulations, you're done with your pre-class assignment!

Now, you just need to submit this assignment by uploading it to the course Desire2Learn web page for today's dropbox (Don't forget to add your name in the first cell).

© Copyright 2017, Michigan State University Board of Trustees