Ensure plots show up in the notebook itself
In [32]:
%matplotlib inline
The entire dogscats/ directory can be downloaded from here. It has been placed in a data/ directory alongside this notebook. Add the entire "data/" directory to the .gitignore file to prevent the dataset from getting uploaded to GitHub.
Add a line to toggle between the whole dataset and just the sample/ directory, which is much faster for prototyping.
In [33]:
#path = "data/dogscats/"
path = "data/dogscats/sample/"
In [34]:
# Handy trick to get a file link
from IPython.display import FileLink
FileLink('AF.png')
Out[34]:
Log loss is known in Keras as binary cross-entropy (binary_crossentropy) for two classes, or categorical cross-entropy (categorical_crossentropy) for more than two
Instead of submitting hard 0/1 predictions, clip them so anything below 0.05 becomes 0.05 and anything above 0.95 becomes 0.95: np.clip(preds[:,1], 0.05, 0.95). Log loss penalizes confident wrong answers almost without bound, so capping the confidence caps the damage (see the sketch after these notes)
After each epoch, save the model weights with vgg.model.save_weights; if you later find you've overfit, you can roll back to an intermediate checkpoint
Halfway through the epochs, drop the learning rate from 0.1 to 0.01: vgg.model.optimizer.lr = 0.01
Predictions generally get more accurate and more confident as you run more epochs, up to the point where overfitting sets in
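To see why the clipping tip matters, here is a minimal sketch (illustrative numbers, not from the original submission code). Log loss charges -log(p) when the true label is 1 and the model predicted probability p, so a single near-certain wrong answer can dominate the average:
In [ ]:
import numpy as np

p_wrong = 1e-15                                # "certain" but wrong prediction for a true positive
print(-np.log(p_wrong))                        # ~34.5: one bad answer dominates the mean loss
print(-np.log(np.clip(p_wrong, 0.05, 0.95)))   # ~3.0: clipping caps the penalty
The save-weights-per-epoch tip can also be automated; this sketch assumes the tf.keras ModelCheckpoint callback rather than the vgg wrapper's own method:
In [ ]:
from tensorflow.keras.callbacks import ModelCheckpoint

# Save weights at the end of every epoch so an overfit run can be rolled back.
ckpt = ModelCheckpoint('weights.{epoch:02d}.h5', save_weights_only=True)
# model.fit(..., callbacks=[ckpt])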
In [35]:
import numpy as np
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD
In [36]:
x = np.random.random((30, 2))    # 30 random points in the unit square
y = np.dot(x, [2., 3.]) + 1.     # targets from y = 2*x0 + 3*x1 + 1
In [37]:
x[:5]
Out[37]:
In [38]:
y[:5]
Out[38]:
In [39]:
lm = Sequential([Dense(1, input_shape=(2,))])    # one Dense unit on two inputs = linear regression
In [40]:
lm.compile(optimizer=SGD(learning_rate=0.1), loss='mse')
In [41]:
lm.evaluate(x, y, verbose=0)    # MSE before training
Out[41]:
In [42]:
lm.fit(x, y, epochs=50, batch_size=10)
Out[42]:
In [43]:
lm.evaluate(x, y, verbose=0)    # MSE after training: should be far smaller
Out[43]:
In [44]:
lm.get_weights()    # should approach the true weights [[2.], [3.]] and bias [1.]
Out[44]:
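As a sanity check (a sketch added here, not part of the original notebook), the closed-form least-squares solution should agree with what SGD recovered:
In [ ]:
# Append a bias column and solve min ||X1 @ w - y||^2 in closed form.
X1 = np.column_stack([x, np.ones(len(x))])
w, *_ = np.linalg.lstsq(X1, y, rcond=None)
w    # ~array([2., 3., 1.]): matches lm.get_weights() up to SGD noise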
In [45]:
lm.summary()    # 3 trainable parameters: two weights plus one bias