In [5]:
# %load /Users/facaiyan/Study/book_notes/preconfig.py
%matplotlib inline

import matplotlib.pyplot as plt
import seaborn as sns
sns.set(color_codes=True)
sns.set(font='SimHei')  # SimHei font so Chinese labels render correctly
plt.rcParams['axes.grid'] = False

#from IPython.display import SVG
def show_image(filename, figsize=None, res_dir=True):
    """Display an image file with matplotlib; resolve it under ./res/ by default."""
    if figsize:
        plt.figure(figsize=figsize)

    if res_dir:
        filename = './res/{}'.format(filename)

    plt.imshow(plt.imread(filename))

Chapter 1 Introduction


In [10]:
show_image('fig1_5.png', figsize=[12, 10])



In [8]:
show_image('fig1_4.png', figsize=[10, 8])


History:

  • distributed representation
  • back-propagation
  • long short-term memory (LSTM) network: used for many sequence modeling tasks, including natural language processing.
  • recurrent neural networks: sequence-to-sequence learning, such as machine translation.
  • self-programming technology
  • reinforcement learning: an autonomous agent must learn to perform a task by trial and error, without any guidance from the human operator.

Trend:

  • Increasing Dataset Sizes
    1. Deep learning has become more useful as the amount of available training data has increased.
    2. Fortunately, the amount of skill required decreases as the amount of training data increases.
    3. Rule of thumb for supervised learning (see the sketch after this list):
      • acceptable performance: about 5,000 labeled examples per category.
      • match or exceed human performance: at least 10 million labeled examples.
  • Increasing Model Sizes
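
A minimal sketch (not from the book) of the dataset-size trend above: fit a fixed-capacity polynomial model on growing training sets and watch the held-out error fall. It reuses the plt imported in the preamble cell; the toy sine target, noise level, polynomial degree, and training sizes are arbitrary choices for illustration.

In [ ]:
import numpy as np

rng = np.random.RandomState(0)

def make_data(n):
    # noisy sine curve as a toy regression target
    x = rng.uniform(-3, 3, n)
    y = np.sin(x) + 0.3 * rng.randn(n)
    return x, y

x_test, y_test = make_data(1000)

sizes = [20, 50, 100, 500, 2000, 10000]
errors = []
for n in sizes:
    x_train, y_train = make_data(n)
    coef = np.polyfit(x_train, y_train, deg=9)   # model capacity held fixed
    errors.append(np.mean((np.polyval(coef, x_test) - y_test) ** 2))

plt.semilogx(sizes, errors, marker='o')
plt.xlabel('training set size')
plt.ylabel('test MSE')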

In [11]:
show_image('fig1_11.png', figsize=[10, 8])



In [ ]: