## Deep Learning
"""
ReLU: Rectified Linear Unit
End-to-end training (no pre-training)
Data augmentation techniques
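A minimal NumPy sketch of ReLU and its (sub)gradient; the `relu`/`relu_grad` names are just illustrative:

```python
import numpy as np

def relu(x):
    # ReLU (Rectified Linear Unit): max(0, x) elementwise.
    # Unlike sigmoid/tanh, its gradient is 1 wherever x > 0,
    # which helps gradients flow through deep networks.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Subgradient: 1 where x > 0, else 0.
    return (x > 0).astype(np.float64)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # zeros for non-positive inputs, identity otherwise
print(relu_grad(x))  # 0/1 mask of where gradient passes through
```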

VGG: Keep the network design simple, focus only on depth;  what is the relation between network depth and performance?
GoogLeNet: Branching, Bottleneck, and Skip Connection
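The bottleneck idea can be sketched without any framework: a 1x1 convolution is just a matrix multiply over the channel axis at every spatial position, so it can shrink the channel count cheaply before an expensive 3x3/5x5 conv. The shapes below are hypothetical:

```python
import numpy as np

def conv1x1(x, w):
    # x: feature map of shape (C_in, H, W); w: weights (C_out, C_in).
    # A 1x1 convolution mixes channels independently at each pixel,
    # i.e. a matrix multiply along the channel dimension.
    return np.einsum('oc,chw->ohw', w, x)

rng = np.random.default_rng(0)
x = rng.standard_normal((256, 28, 28))    # 256-channel input map
w = rng.standard_normal((64, 256)) * 0.01 # bottleneck: 256 -> 64 channels
y = conv1x1(x, w)
print(y.shape)  # (64, 28, 28): same spatial size, 4x fewer channels
```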

ResNet:
Challenges of depth vs complexity

SIFT/HOG (hand-crafted features)
: More features --> deeper networks;

Traditional components: we specifically design which features to look at
Deep Learning: no background knowledge/human concepts;  idea of layers: generic components stacked into a network

Vanishing Gradient Challenge
: Xavier Initialization in Caffe
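A sketch of the Xavier/Glorot idea (Caffe's "xavier" filler defaults to variance 1/fan_in; the Gaussian variant below is an assumption for simplicity): scaling weights this way keeps activation variance roughly constant across layers, which mitigates vanishing/exploding signals.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Draw weights with variance 1/fan_in so that the variance of
    # x @ W stays close to the variance of x, layer after layer.
    std = np.sqrt(1.0 / fan_in)
    return rng.standard_normal((fan_in, fan_out)) * std

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 512))   # batch of unit-variance inputs
h = x
for _ in range(10):                    # 10 stacked linear layers
    h = h @ xavier_init(h.shape[1], 512, rng)
print(h.var())  # stays near 1 instead of collapsing toward 0
```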

Batch Normalization:
Why do we have higher training error when simply stacking more layers?
:  
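The batch-norm forward pass itself is small; a minimal sketch of the training-time computation (no running statistics, which a real layer would also track):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch to zero mean and unit
    # variance, then apply a learnable scale (gamma) and shift (beta).
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = 5.0 + 3.0 * rng.standard_normal((64, 10))  # shifted, scaled batch
y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
print(y.mean(axis=0))  # ~0 for every feature
print(y.std(axis=0))   # ~1 for every feature
```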

Identity mapping for optimization
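Why identity mapping helps optimization, as a toy residual block (layer sizes and weight values are illustrative): the layers learn a residual F(x) and the block outputs F(x) + x, so recovering the identity only requires driving F(x) toward zero.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    # ResNet-style block: output = relu(F(x) + x), where
    # F(x) = relu(x @ w1) @ w2 is the learned residual branch.
    f = relu(x @ w1) @ w2
    return relu(f + x)  # skip connection adds the input back

rng = np.random.default_rng(0)
x = relu(rng.standard_normal((4, 16)))  # non-negative activations
w1 = np.zeros((16, 16))  # extreme case: residual branch outputs zero
w2 = np.zeros((16, 16))
y = residual_block(x, w1, w2)
print(np.allclose(y, x))  # True: the block falls back to identity
```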

Forward Propagation and Backward Propagation
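A minimal sketch of both passes for a one-hidden-layer network with squared-error loss (all sizes hypothetical), finishing with a finite-difference check of one gradient entry:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 3))   # batch of 5 inputs, 3 features
t = rng.standard_normal((5, 1))   # regression targets
w1 = rng.standard_normal((3, 4)) * 0.5
w2 = rng.standard_normal((4, 1)) * 0.5

# Forward propagation: compute and cache intermediate values.
h = np.maximum(0.0, x @ w1)       # hidden ReLU activations
y = h @ w2                        # network output
loss = 0.5 * np.sum((y - t) ** 2)

# Backward propagation: chain rule applied layer by layer.
dy = y - t                        # dL/dy
dw2 = h.T @ dy                    # dL/dw2
dh = dy @ w2.T                    # dL/dh
dh[h <= 0] = 0.0                  # ReLU gradient mask
dw1 = x.T @ dh                    # dL/dw1

# Numerical check of one entry of dw1 via finite differences.
eps = 1e-6
w1[0, 0] += eps
loss2 = 0.5 * np.sum((np.maximum(0.0, x @ w1) @ w2 - t) ** 2)
print(dw1[0, 0], (loss2 - loss) / eps)  # analytic vs numeric gradient
```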


R-CNN: Object detection
What is the biggest setback of a regular R-CNN network?  How does Fast R-CNN fix that?
: What is an R-CNN network?  What makes it powerful?
: What is the external module?
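One concrete piece of the R-CNN pipeline that fits in a few lines is intersection-over-union (IoU), used to match the proposals from the external module (e.g. selective search) against ground-truth boxes when labeling training examples; the box values below are made up:

```python
def iou(box_a, box_b):
    # Intersection-over-Union between two boxes given as
    # (x1, y1, x2, y2) corner coordinates.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (0, 0, 10, 10)))  # 1.0 (identical boxes)
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 1/3 (half-width overlap)
```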
"""