This notebook is a modified fork of the original, available here.
Class activation mapping (CAM) is one of many ways to visualize and gain insights from a convolutional neural network (CNN). In this approach, a "class activation" heatmap is generated over the input image: a 2D grid of scores associated with a specific output class, computed for every location in the input image, indicating how important each location is for the class being considered. It is a simple way to show an observer which features the CNN model is looking at while generating its predictions.
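The heatmap computed later in this notebook corresponds to the gradient-weighted variant of CAM. Writing $A^k$ for the $k$-th feature map of the last convolutional layer and $y^c$ for the score of class $c$, the per-channel weights and the heatmap can be written (up to normalization) as

$$
\alpha_k^c = \frac{1}{Z}\sum_{i}\sum_{j}\frac{\partial y^c}{\partial A_{ij}^{k}},
\qquad
L_{ij}^{c} = \max\!\Big(0,\ \sum_k \alpha_k^c\, A_{ij}^{k}\Big),
$$

where $Z$ is the number of spatial locations in the feature map. The notebook then rescales the heatmap to the $[0, 1]$ range before overlaying it on the input image.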
This notebook uses Keras and TensorFlow.
The visual summary of features that CAM provides is useful for building more explainable deep-learning models, and many interesting insights can be derived from a CNN this way. Some examples are shown towards the end of this notebook, after the preview sketch below.
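As a preview of the step-by-step cells that follow, here is a minimal sketch of the gradient-weighted CAM computation, assuming the same Keras 2.x / TensorFlow 1.x backend API used throughout this notebook; the helper name make_cam is purely illustrative and not a Keras function.

# Minimal sketch of the gradient-weighted CAM computation worked through below.
# Assumes `model` is a Keras model with ImageNet weights and `x` is a
# preprocessed batch of shape (1, H, W, 3).
import numpy as np
from keras import backend as K

def make_cam(model, x, conv_layer_name):
    preds = model.predict(x)                               # forward pass
    class_idx = np.argmax(preds[0])                        # top predicted class
    class_output = model.output[:, class_idx]              # scalar score for that class
    conv_layer = model.get_layer(conv_layer_name)
    grads = K.gradients(class_output, conv_layer.output)[0]
    pooled_grads = K.mean(grads, axis=(0, 1, 2))           # one weight per feature-map channel
    iterate = K.function([model.input], [pooled_grads, conv_layer.output[0]])
    pooled_grads_value, conv_output_value = iterate([x])
    # Weight each channel of the conv feature map by its pooled gradient
    for i in range(conv_output_value.shape[-1]):
        conv_output_value[:, :, i] *= pooled_grads_value[i]
    heatmap = np.maximum(np.mean(conv_output_value, axis=-1), 0)
    return heatmap / np.max(heatmap)                       # normalized to [0, 1]

# e.g. heatmap = make_cam(model_vgg16, x, 'block5_conv3'), once those objects are defined below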
In [1]:
from keras.applications.vgg16 import VGG16
import matplotlib.image as mpimg
from keras import backend as K
import matplotlib.pyplot as plt
%matplotlib inline
K.clear_session()
Using TensorFlow backend.
In [2]:
model_vgg16 = VGG16(weights='imagenet')
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
In [3]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/hummingbird_01.jpg -O hummingbird_01.jpg
--2019-02-21 09:28:41-- https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/hummingbird_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 89448 (87K) [image/jpeg]
Saving to: ‘hummingbird_01.jpg’
hummingbird_01.jpg 100%[===================>] 87.35K --.-KB/s in 0.01s
2019-02-21 09:28:41 (7.92 MB/s) - ‘hummingbird_01.jpg’ saved [89448/89448]
In [4]:
img_path = './hummingbird_01.jpg'
img=mpimg.imread(img_path)
plt.imshow(img)
Out[4]:
<matplotlib.image.AxesImage at 0x7fe29017cf60>
In [0]:
from keras.preprocessing import image
img = image.load_img(img_path, target_size=(224, 224))
In [0]:
x = image.img_to_array(img)
In [0]:
import numpy as np
x = np.expand_dims(x, axis=0)
In [8]:
x.shape
Out[8]:
(1, 224, 224, 3)
In [0]:
from keras.applications.vgg16 import preprocess_input
x = preprocess_input(x)
In [10]:
import pandas as pd
from keras.applications.vgg16 import decode_predictions
preds = model_vgg16.predict(x)
predictions = pd.DataFrame(decode_predictions(preds, top=3)[0],columns=['col1','category','probability']).iloc[:,1:]
print('PREDICTION:',predictions.loc[0,'category'])
PREDICTION: hummingbird
In [11]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=predictions,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')
/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
stat_data = remove_na(group_data)
Out[11]:
Text(0.5, 1.0, 'Top 3 Predictions:')
In [0]:
argmax = np.argmax(preds[0])
In [0]:
output = model_vgg16.output[:, argmax]
In [14]:
model_vgg16.summary()
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) (None, 224, 224, 3) 0
_________________________________________________________________
block1_conv1 (Conv2D) (None, 224, 224, 64) 1792
_________________________________________________________________
block1_conv2 (Conv2D) (None, 224, 224, 64) 36928
_________________________________________________________________
block1_pool (MaxPooling2D) (None, 112, 112, 64) 0
_________________________________________________________________
block2_conv1 (Conv2D) (None, 112, 112, 128) 73856
_________________________________________________________________
block2_conv2 (Conv2D) (None, 112, 112, 128) 147584
_________________________________________________________________
block2_pool (MaxPooling2D) (None, 56, 56, 128) 0
_________________________________________________________________
block3_conv1 (Conv2D) (None, 56, 56, 256) 295168
_________________________________________________________________
block3_conv2 (Conv2D) (None, 56, 56, 256) 590080
_________________________________________________________________
block3_conv3 (Conv2D) (None, 56, 56, 256) 590080
_________________________________________________________________
block3_pool (MaxPooling2D) (None, 28, 28, 256) 0
_________________________________________________________________
block4_conv1 (Conv2D) (None, 28, 28, 512) 1180160
_________________________________________________________________
block4_conv2 (Conv2D) (None, 28, 28, 512) 2359808
_________________________________________________________________
block4_conv3 (Conv2D) (None, 28, 28, 512) 2359808
_________________________________________________________________
block4_pool (MaxPooling2D) (None, 14, 14, 512) 0
_________________________________________________________________
block5_conv1 (Conv2D) (None, 14, 14, 512) 2359808
_________________________________________________________________
block5_conv2 (Conv2D) (None, 14, 14, 512) 2359808
_________________________________________________________________
block5_conv3 (Conv2D) (None, 14, 14, 512) 2359808
_________________________________________________________________
block5_pool (MaxPooling2D) (None, 7, 7, 512) 0
_________________________________________________________________
flatten (Flatten) (None, 25088) 0
_________________________________________________________________
fc1 (Dense) (None, 4096) 102764544
_________________________________________________________________
fc2 (Dense) (None, 4096) 16781312
_________________________________________________________________
predictions (Dense) (None, 1000) 4097000
=================================================================
Total params: 138,357,544
Trainable params: 138,357,544
Non-trainable params: 0
_________________________________________________________________
In [0]:
last_conv_layer = model_vgg16.get_layer('block5_conv3')
In [0]:
grads = K.gradients(output, last_conv_layer.output)[0]
In [0]:
pooled_grads = K.mean(grads, axis=(0, 1, 2))
In [0]:
iterate = K.function([model_vgg16.input], [pooled_grads, last_conv_layer.output[0]])
In [0]:
pooled_grads_value, conv_layer_output_value = iterate([x])
In [0]:
for i in range(conv_layer_output_value.shape[2]):
    conv_layer_output_value[:, :, i] *= pooled_grads_value[i]  # weight each channel by its pooled gradient
In [21]:
heatmap = np.mean(conv_layer_output_value, axis=-1)
heatmap = np.maximum(heatmap, 0)
heatmap /= np.max(heatmap)
plt.matshow(heatmap)
plt.show()
In [0]:
import cv2
img = cv2.imread(img_path)
In [0]:
heatmap = cv2.resize(heatmap, (img.shape[1], img.shape[0]))
In [0]:
heatmap = np.uint8(255 * heatmap)
In [0]:
heatmap = cv2.applyColorMap(heatmap, cv2.COLORMAP_JET)
In [0]:
hif = .8
In [0]:
superimposed_img = heatmap * hif + img
In [0]:
output = './output.jpeg'
cv2.imwrite(output, superimposed_img)
img=mpimg.imread(output)
In [0]:
from google.colab import files
files.download(output)
In [30]:
plt.imshow(img)
plt.axis('off')
plt.title(predictions.loc[0,'category'])
Out[30]:
Text(0.5, 1.0, 'hummingbird')
In [0]:
def class_activation_map(INPUT_IMG_FILE=None,
                         PRE_PROCESSOR=None,
                         LABEL_DECODER=None,
                         MODEL=None,
                         LABELS=None,
                         IM_WIDTH=299,
                         IM_HEIGHT=299,
                         CONV_LAYER='conv_7b',
                         URL_MODE=False,
                         FILE_MODE=True,
                         EVAL_STEPS=10,
                         HEATMAP_SHAPE=[14, 14]):
    if INPUT_IMG_FILE is None:
        print('No input file specified to generate predictions ...')
        return
    # Load the input image from a URL, accept a pre-built array, or read it from disk
    if URL_MODE:
        response = requests.get(INPUT_IMG_FILE)
        img = Image.open(BytesIO(response.content))
        img = img.resize((IM_WIDTH, IM_HEIGHT))
    elif FILE_MODE:
        img = INPUT_IMG_FILE
    else:
        img = image.load_img(INPUT_IMG_FILE, target_size=(IM_WIDTH, IM_HEIGHT))
    x = img
    if not FILE_MODE:
        x = image.img_to_array(img)
        x = np.expand_dims(x, axis=0)
    if PRE_PROCESSOR is not None:
        preprocess_input = PRE_PROCESSOR
        x = preprocess_input(x)
    model = MODEL
    if model is None:
        print('No input model specified to generate predictions ...')
        return
    labels = LABELS
    heatmaps = []
    heatmap_sum = np.zeros(HEATMAP_SHAPE, float)  # accumulator for averaging heatmaps across steps
    last_conv_layer = model.get_layer(CONV_LAYER)
    feature_size = tensor_featureSizeExtractor(last_conv_layer)
    for step in range(EVAL_STEPS):
        start = time.time()
        preds = model.predict(x)
        probability = preds.flatten()
        prediction = []
        if labels is not None:
            prediction = labels[np.argmax(probability)]
        elif LABEL_DECODER is not None:
            prediction = pd.DataFrame(LABEL_DECODER(preds, top=3)[0], columns=['col1', 'category', 'probability']).iloc[:, 1:]
            print('PREDICTION:', prediction.loc[0, 'category'])
        else:
            print('No labels will be generated ...')
        accuracy = probability[np.argmax(probability)]
        argmax = np.argmax(preds[0])
        output = model.output[:, argmax]
        # Gradient of the predicted class score with respect to the last convolutional layer
        grads = K.gradients(output, last_conv_layer.output)[0]
        pooled_grads = K.mean(grads, axis=(0, 1, 2))
        iterate = K.function([model.input], [pooled_grads, last_conv_layer.output[0]])
        pooled_grads_value, conv_layer_output_value = iterate([x])
        # Weight each feature-map channel by its pooled gradient
        for i in range(feature_size):
            conv_layer_output_value[:, :, i] *= pooled_grads_value[i]
        heatmap = np.mean(conv_layer_output_value, axis=-1)
        heatmap = np.maximum(heatmap, 0)
        heatmap /= np.max(heatmap)
        try:
            heatmap_sum = np.add(heatmap_sum, heatmap)
            heatmaps.append(heatmap)
            if EVAL_STEPS > 1:
                del heatmap
        except:
            print('Failed updating heatmaps')
        end = time.time()
        execution_time = end - start
        print('Completed processing {} out of {} steps in {} seconds ...'.format(int(step + 1), int(EVAL_STEPS), float(execution_time)))
    if EVAL_STEPS > 1:
        mean_heatmap = heatmap_sum / EVAL_STEPS
    else:
        mean_heatmap = heatmap
    return [mean_heatmap, heatmaps, preds[0], prediction, accuracy, probability]
In [0]:
def tensor_featureSizeExtractor(last_conv_layer):
    if len(last_conv_layer.output.get_shape().as_list()) == 4:
        feature_size = last_conv_layer.output.get_shape().as_list()[3]
        return feature_size
    else:
        return 'Received tensor shape: {} instead of expected shape: 4'.format(len(last_conv_layer.output.get_shape().as_list()))
In [0]:
INPUT_IMG_FILE = 'https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/hummingbird_01.jpg'
CONV_LAYER = 'block5_conv3'
In [0]:
from keras.applications.vgg16 import preprocess_input as PRE_PROCESSOR
import requests
from PIL import Image
from io import BytesIO
import time
In [35]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              MODEL=model_vgg16,
                              LABELS=None,
                              IM_WIDTH=224,
                              IM_HEIGHT=224,
                              CONV_LAYER=CONV_LAYER,
                              EVAL_STEPS=1,
                              URL_MODE=True,
                              FILE_MODE=False)
No labels will be generated ...
Completed processing 1 out of 1 steps in 0.15799641609191895 seconds ...
In [36]:
HEATMAP = output[0]
plt.matshow(HEATMAP)
plt.show()
In [0]:
def heatmap_overlay(INPUT_IMG_FILE,
                    HEATMAP,
                    THRESHOLD=0.8):
    img = cv2.imread(INPUT_IMG_FILE)
    heatmap = cv2.resize(HEATMAP, (img.shape[1], img.shape[0]))
    heatmap = np.uint8(255 * heatmap)
    heatmap = cv2.applyColorMap(heatmap, cv2.COLORMAP_JET)
    hif = THRESHOLD
    superimposed_img = heatmap * hif + img
    return [superimposed_img, heatmap]
In [0]:
INPUT_IMG_FILE = './hummingbird_01.jpg'
In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
In [0]:
output_file = './class_activation_map.jpeg'
cv2.imwrite(output_file, superimposed_img)
img=mpimg.imread(output_file)
In [41]:
plt.imshow(img)
Out[41]:
<matplotlib.image.AxesImage at 0x7fe271c58cc0>
In [0]:
from keras.applications.inception_resnet_v2 import InceptionResNetV2
%matplotlib inline
K.clear_session()
In [0]:
model_InceptionResNetV2 = InceptionResNetV2(weights='imagenet')
In [44]:
model_InceptionResNetV2.summary()
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, 299, 299, 3) 0
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 149, 149, 32) 864 input_1[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 149, 149, 32) 96 conv2d_1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, 149, 149, 32) 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 147, 147, 32) 9216 activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 147, 147, 32) 96 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, 147, 147, 32) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 147, 147, 64) 18432 activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 147, 147, 64) 192 conv2d_3[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, 147, 147, 64) 0 batch_normalization_3[0][0]
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, 73, 73, 64) 0 activation_3[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 73, 73, 80) 5120 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 73, 73, 80) 240 conv2d_4[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, 73, 73, 80) 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 71, 71, 192) 138240 activation_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 71, 71, 192) 576 conv2d_5[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, 71, 71, 192) 0 batch_normalization_5[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, 35, 35, 192) 0 activation_5[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 35, 35, 64) 12288 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 35, 35, 64) 192 conv2d_9[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, 35, 35, 64) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 35, 35, 48) 9216 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 35, 35, 96) 55296 activation_9[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 35, 35, 48) 144 conv2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 35, 35, 96) 288 conv2d_10[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, 35, 35, 48) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, 35, 35, 96) 0 batch_normalization_10[0][0]
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, 35, 35, 192) 0 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 35, 35, 96) 18432 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 35, 35, 64) 76800 activation_7[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 35, 35, 96) 82944 activation_10[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 35, 35, 64) 12288 average_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 35, 35, 96) 288 conv2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 35, 35, 64) 192 conv2d_8[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 35, 35, 96) 288 conv2d_11[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 35, 35, 64) 192 conv2d_12[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, 35, 35, 96) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, 35, 35, 64) 0 batch_normalization_8[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, 35, 35, 96) 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, 35, 35, 64) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
mixed_5b (Concatenate) (None, 35, 35, 320) 0 activation_6[0][0]
activation_8[0][0]
activation_11[0][0]
activation_12[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 35, 35, 32) 10240 mixed_5b[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 35, 35, 32) 96 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, 35, 35, 32) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 35, 35, 32) 10240 mixed_5b[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 35, 35, 48) 13824 activation_16[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 35, 35, 32) 96 conv2d_14[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 35, 35, 48) 144 conv2d_17[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, 35, 35, 32) 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, 35, 35, 48) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 35, 35, 32) 10240 mixed_5b[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 35, 35, 32) 9216 activation_14[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 35, 35, 64) 27648 activation_17[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 35, 35, 32) 96 conv2d_13[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 35, 35, 32) 96 conv2d_15[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 35, 35, 64) 192 conv2d_18[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, 35, 35, 32) 0 batch_normalization_13[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, 35, 35, 32) 0 batch_normalization_15[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, 35, 35, 64) 0 batch_normalization_18[0][0]
__________________________________________________________________________________________________
block35_1_mixed (Concatenate) (None, 35, 35, 128) 0 activation_13[0][0]
activation_15[0][0]
activation_18[0][0]
__________________________________________________________________________________________________
block35_1_conv (Conv2D) (None, 35, 35, 320) 41280 block35_1_mixed[0][0]
__________________________________________________________________________________________________
block35_1 (Lambda) (None, 35, 35, 320) 0 mixed_5b[0][0]
block35_1_conv[0][0]
__________________________________________________________________________________________________
block35_1_ac (Activation) (None, 35, 35, 320) 0 block35_1[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 35, 35, 32) 10240 block35_1_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 35, 35, 32) 96 conv2d_22[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, 35, 35, 32) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 35, 35, 32) 10240 block35_1_ac[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 35, 35, 48) 13824 activation_22[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 35, 35, 32) 96 conv2d_20[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 35, 35, 48) 144 conv2d_23[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, 35, 35, 32) 0 batch_normalization_20[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, 35, 35, 48) 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 35, 35, 32) 10240 block35_1_ac[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 35, 35, 32) 9216 activation_20[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 35, 35, 64) 27648 activation_23[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 35, 35, 32) 96 conv2d_19[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 35, 35, 32) 96 conv2d_21[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 35, 35, 64) 192 conv2d_24[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, 35, 35, 32) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, 35, 35, 32) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
activation_24 (Activation) (None, 35, 35, 64) 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
block35_2_mixed (Concatenate) (None, 35, 35, 128) 0 activation_19[0][0]
activation_21[0][0]
activation_24[0][0]
__________________________________________________________________________________________________
block35_2_conv (Conv2D) (None, 35, 35, 320) 41280 block35_2_mixed[0][0]
__________________________________________________________________________________________________
block35_2 (Lambda) (None, 35, 35, 320) 0 block35_1_ac[0][0]
block35_2_conv[0][0]
__________________________________________________________________________________________________
block35_2_ac (Activation) (None, 35, 35, 320) 0 block35_2[0][0]
__________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, 35, 35, 32) 10240 block35_2_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 35, 35, 32) 96 conv2d_28[0][0]
__________________________________________________________________________________________________
activation_28 (Activation) (None, 35, 35, 32) 0 batch_normalization_28[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, 35, 35, 32) 10240 block35_2_ac[0][0]
__________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, 35, 35, 48) 13824 activation_28[0][0]
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 35, 35, 32) 96 conv2d_26[0][0]
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 35, 35, 48) 144 conv2d_29[0][0]
__________________________________________________________________________________________________
activation_26 (Activation) (None, 35, 35, 32) 0 batch_normalization_26[0][0]
__________________________________________________________________________________________________
activation_29 (Activation) (None, 35, 35, 48) 0 batch_normalization_29[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 35, 35, 32) 10240 block35_2_ac[0][0]
__________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, 35, 35, 32) 9216 activation_26[0][0]
__________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, 35, 35, 64) 27648 activation_29[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 35, 35, 32) 96 conv2d_25[0][0]
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 35, 35, 32) 96 conv2d_27[0][0]
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 35, 35, 64) 192 conv2d_30[0][0]
__________________________________________________________________________________________________
activation_25 (Activation) (None, 35, 35, 32) 0 batch_normalization_25[0][0]
__________________________________________________________________________________________________
activation_27 (Activation) (None, 35, 35, 32) 0 batch_normalization_27[0][0]
__________________________________________________________________________________________________
activation_30 (Activation) (None, 35, 35, 64) 0 batch_normalization_30[0][0]
__________________________________________________________________________________________________
block35_3_mixed (Concatenate) (None, 35, 35, 128) 0 activation_25[0][0]
activation_27[0][0]
activation_30[0][0]
__________________________________________________________________________________________________
block35_3_conv (Conv2D) (None, 35, 35, 320) 41280 block35_3_mixed[0][0]
__________________________________________________________________________________________________
block35_3 (Lambda) (None, 35, 35, 320) 0 block35_2_ac[0][0]
block35_3_conv[0][0]
__________________________________________________________________________________________________
block35_3_ac (Activation) (None, 35, 35, 320) 0 block35_3[0][0]
__________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, 35, 35, 32) 10240 block35_3_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 35, 35, 32) 96 conv2d_34[0][0]
__________________________________________________________________________________________________
activation_34 (Activation) (None, 35, 35, 32) 0 batch_normalization_34[0][0]
__________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, 35, 35, 32) 10240 block35_3_ac[0][0]
__________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, 35, 35, 48) 13824 activation_34[0][0]
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 35, 35, 32) 96 conv2d_32[0][0]
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 35, 35, 48) 144 conv2d_35[0][0]
__________________________________________________________________________________________________
activation_32 (Activation) (None, 35, 35, 32) 0 batch_normalization_32[0][0]
__________________________________________________________________________________________________
activation_35 (Activation) (None, 35, 35, 48) 0 batch_normalization_35[0][0]
__________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, 35, 35, 32) 10240 block35_3_ac[0][0]
__________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, 35, 35, 32) 9216 activation_32[0][0]
__________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, 35, 35, 64) 27648 activation_35[0][0]
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 35, 35, 32) 96 conv2d_31[0][0]
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 35, 35, 32) 96 conv2d_33[0][0]
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 35, 35, 64) 192 conv2d_36[0][0]
__________________________________________________________________________________________________
activation_31 (Activation) (None, 35, 35, 32) 0 batch_normalization_31[0][0]
__________________________________________________________________________________________________
activation_33 (Activation) (None, 35, 35, 32) 0 batch_normalization_33[0][0]
__________________________________________________________________________________________________
activation_36 (Activation) (None, 35, 35, 64) 0 batch_normalization_36[0][0]
__________________________________________________________________________________________________
block35_4_mixed (Concatenate) (None, 35, 35, 128) 0 activation_31[0][0]
activation_33[0][0]
activation_36[0][0]
__________________________________________________________________________________________________
block35_4_conv (Conv2D) (None, 35, 35, 320) 41280 block35_4_mixed[0][0]
__________________________________________________________________________________________________
block35_4 (Lambda) (None, 35, 35, 320) 0 block35_3_ac[0][0]
block35_4_conv[0][0]
__________________________________________________________________________________________________
block35_4_ac (Activation) (None, 35, 35, 320) 0 block35_4[0][0]
__________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, 35, 35, 32) 10240 block35_4_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 35, 35, 32) 96 conv2d_40[0][0]
__________________________________________________________________________________________________
activation_40 (Activation) (None, 35, 35, 32) 0 batch_normalization_40[0][0]
__________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, 35, 35, 32) 10240 block35_4_ac[0][0]
__________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, 35, 35, 48) 13824 activation_40[0][0]
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 35, 35, 32) 96 conv2d_38[0][0]
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 35, 35, 48) 144 conv2d_41[0][0]
__________________________________________________________________________________________________
activation_38 (Activation) (None, 35, 35, 32) 0 batch_normalization_38[0][0]
__________________________________________________________________________________________________
activation_41 (Activation) (None, 35, 35, 48) 0 batch_normalization_41[0][0]
__________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, 35, 35, 32) 10240 block35_4_ac[0][0]
__________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, 35, 35, 32) 9216 activation_38[0][0]
__________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, 35, 35, 64) 27648 activation_41[0][0]
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 35, 35, 32) 96 conv2d_37[0][0]
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 35, 35, 32) 96 conv2d_39[0][0]
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 35, 35, 64) 192 conv2d_42[0][0]
__________________________________________________________________________________________________
activation_37 (Activation) (None, 35, 35, 32) 0 batch_normalization_37[0][0]
__________________________________________________________________________________________________
activation_39 (Activation) (None, 35, 35, 32) 0 batch_normalization_39[0][0]
__________________________________________________________________________________________________
activation_42 (Activation) (None, 35, 35, 64) 0 batch_normalization_42[0][0]
__________________________________________________________________________________________________
block35_5_mixed (Concatenate) (None, 35, 35, 128) 0 activation_37[0][0]
activation_39[0][0]
activation_42[0][0]
__________________________________________________________________________________________________
block35_5_conv (Conv2D) (None, 35, 35, 320) 41280 block35_5_mixed[0][0]
__________________________________________________________________________________________________
block35_5 (Lambda) (None, 35, 35, 320) 0 block35_4_ac[0][0]
block35_5_conv[0][0]
__________________________________________________________________________________________________
block35_5_ac (Activation) (None, 35, 35, 320) 0 block35_5[0][0]
__________________________________________________________________________________________________
conv2d_46 (Conv2D) (None, 35, 35, 32) 10240 block35_5_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 35, 35, 32) 96 conv2d_46[0][0]
__________________________________________________________________________________________________
activation_46 (Activation) (None, 35, 35, 32) 0 batch_normalization_46[0][0]
__________________________________________________________________________________________________
conv2d_44 (Conv2D) (None, 35, 35, 32) 10240 block35_5_ac[0][0]
__________________________________________________________________________________________________
conv2d_47 (Conv2D) (None, 35, 35, 48) 13824 activation_46[0][0]
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 35, 35, 32) 96 conv2d_44[0][0]
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 35, 35, 48) 144 conv2d_47[0][0]
__________________________________________________________________________________________________
activation_44 (Activation) (None, 35, 35, 32) 0 batch_normalization_44[0][0]
__________________________________________________________________________________________________
activation_47 (Activation) (None, 35, 35, 48) 0 batch_normalization_47[0][0]
__________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, 35, 35, 32) 10240 block35_5_ac[0][0]
__________________________________________________________________________________________________
conv2d_45 (Conv2D) (None, 35, 35, 32) 9216 activation_44[0][0]
__________________________________________________________________________________________________
conv2d_48 (Conv2D) (None, 35, 35, 64) 27648 activation_47[0][0]
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 35, 35, 32) 96 conv2d_43[0][0]
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 35, 35, 32) 96 conv2d_45[0][0]
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 35, 35, 64) 192 conv2d_48[0][0]
__________________________________________________________________________________________________
activation_43 (Activation) (None, 35, 35, 32) 0 batch_normalization_43[0][0]
__________________________________________________________________________________________________
activation_45 (Activation) (None, 35, 35, 32) 0 batch_normalization_45[0][0]
__________________________________________________________________________________________________
activation_48 (Activation) (None, 35, 35, 64) 0 batch_normalization_48[0][0]
__________________________________________________________________________________________________
block35_6_mixed (Concatenate) (None, 35, 35, 128) 0 activation_43[0][0]
activation_45[0][0]
activation_48[0][0]
__________________________________________________________________________________________________
block35_6_conv (Conv2D) (None, 35, 35, 320) 41280 block35_6_mixed[0][0]
__________________________________________________________________________________________________
block35_6 (Lambda) (None, 35, 35, 320) 0 block35_5_ac[0][0]
block35_6_conv[0][0]
__________________________________________________________________________________________________
block35_6_ac (Activation) (None, 35, 35, 320) 0 block35_6[0][0]
__________________________________________________________________________________________________
conv2d_52 (Conv2D) (None, 35, 35, 32) 10240 block35_6_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 35, 35, 32) 96 conv2d_52[0][0]
__________________________________________________________________________________________________
activation_52 (Activation) (None, 35, 35, 32) 0 batch_normalization_52[0][0]
__________________________________________________________________________________________________
conv2d_50 (Conv2D) (None, 35, 35, 32) 10240 block35_6_ac[0][0]
__________________________________________________________________________________________________
conv2d_53 (Conv2D) (None, 35, 35, 48) 13824 activation_52[0][0]
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 35, 35, 32) 96 conv2d_50[0][0]
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 35, 35, 48) 144 conv2d_53[0][0]
__________________________________________________________________________________________________
activation_50 (Activation) (None, 35, 35, 32) 0 batch_normalization_50[0][0]
__________________________________________________________________________________________________
activation_53 (Activation) (None, 35, 35, 48) 0 batch_normalization_53[0][0]
__________________________________________________________________________________________________
conv2d_49 (Conv2D) (None, 35, 35, 32) 10240 block35_6_ac[0][0]
__________________________________________________________________________________________________
conv2d_51 (Conv2D) (None, 35, 35, 32) 9216 activation_50[0][0]
__________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, 35, 35, 64) 27648 activation_53[0][0]
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 35, 35, 32) 96 conv2d_49[0][0]
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 35, 35, 32) 96 conv2d_51[0][0]
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 35, 35, 64) 192 conv2d_54[0][0]
__________________________________________________________________________________________________
activation_49 (Activation) (None, 35, 35, 32) 0 batch_normalization_49[0][0]
__________________________________________________________________________________________________
activation_51 (Activation) (None, 35, 35, 32) 0 batch_normalization_51[0][0]
__________________________________________________________________________________________________
activation_54 (Activation) (None, 35, 35, 64) 0 batch_normalization_54[0][0]
__________________________________________________________________________________________________
block35_7_mixed (Concatenate) (None, 35, 35, 128) 0 activation_49[0][0]
activation_51[0][0]
activation_54[0][0]
__________________________________________________________________________________________________
block35_7_conv (Conv2D) (None, 35, 35, 320) 41280 block35_7_mixed[0][0]
__________________________________________________________________________________________________
block35_7 (Lambda) (None, 35, 35, 320) 0 block35_6_ac[0][0]
block35_7_conv[0][0]
__________________________________________________________________________________________________
block35_7_ac (Activation) (None, 35, 35, 320) 0 block35_7[0][0]
__________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, 35, 35, 32) 10240 block35_7_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 35, 35, 32) 96 conv2d_58[0][0]
__________________________________________________________________________________________________
activation_58 (Activation) (None, 35, 35, 32) 0 batch_normalization_58[0][0]
__________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, 35, 35, 32) 10240 block35_7_ac[0][0]
__________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, 35, 35, 48) 13824 activation_58[0][0]
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 35, 35, 32) 96 conv2d_56[0][0]
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 35, 35, 48) 144 conv2d_59[0][0]
__________________________________________________________________________________________________
activation_56 (Activation) (None, 35, 35, 32) 0 batch_normalization_56[0][0]
__________________________________________________________________________________________________
activation_59 (Activation) (None, 35, 35, 48) 0 batch_normalization_59[0][0]
__________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, 35, 35, 32) 10240 block35_7_ac[0][0]
__________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, 35, 35, 32) 9216 activation_56[0][0]
__________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, 35, 35, 64) 27648 activation_59[0][0]
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 35, 35, 32) 96 conv2d_55[0][0]
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 35, 35, 32) 96 conv2d_57[0][0]
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 35, 35, 64) 192 conv2d_60[0][0]
__________________________________________________________________________________________________
activation_55 (Activation) (None, 35, 35, 32) 0 batch_normalization_55[0][0]
__________________________________________________________________________________________________
activation_57 (Activation) (None, 35, 35, 32) 0 batch_normalization_57[0][0]
__________________________________________________________________________________________________
activation_60 (Activation) (None, 35, 35, 64) 0 batch_normalization_60[0][0]
__________________________________________________________________________________________________
block35_8_mixed (Concatenate) (None, 35, 35, 128) 0 activation_55[0][0]
activation_57[0][0]
activation_60[0][0]
__________________________________________________________________________________________________
block35_8_conv (Conv2D) (None, 35, 35, 320) 41280 block35_8_mixed[0][0]
__________________________________________________________________________________________________
block35_8 (Lambda) (None, 35, 35, 320) 0 block35_7_ac[0][0]
block35_8_conv[0][0]
__________________________________________________________________________________________________
block35_8_ac (Activation) (None, 35, 35, 320) 0 block35_8[0][0]
__________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, 35, 35, 32) 10240 block35_8_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 35, 35, 32) 96 conv2d_64[0][0]
__________________________________________________________________________________________________
activation_64 (Activation) (None, 35, 35, 32) 0 batch_normalization_64[0][0]
__________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, 35, 35, 32) 10240 block35_8_ac[0][0]
__________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, 35, 35, 48) 13824 activation_64[0][0]
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 35, 35, 32) 96 conv2d_62[0][0]
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 35, 35, 48) 144 conv2d_65[0][0]
__________________________________________________________________________________________________
activation_62 (Activation) (None, 35, 35, 32) 0 batch_normalization_62[0][0]
__________________________________________________________________________________________________
activation_65 (Activation) (None, 35, 35, 48) 0 batch_normalization_65[0][0]
__________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, 35, 35, 32) 10240 block35_8_ac[0][0]
__________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, 35, 35, 32) 9216 activation_62[0][0]
__________________________________________________________________________________________________
conv2d_66 (Conv2D) (None, 35, 35, 64) 27648 activation_65[0][0]
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 35, 35, 32) 96 conv2d_61[0][0]
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 35, 35, 32) 96 conv2d_63[0][0]
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 35, 35, 64) 192 conv2d_66[0][0]
__________________________________________________________________________________________________
activation_61 (Activation) (None, 35, 35, 32) 0 batch_normalization_61[0][0]
__________________________________________________________________________________________________
activation_63 (Activation) (None, 35, 35, 32) 0 batch_normalization_63[0][0]
__________________________________________________________________________________________________
activation_66 (Activation) (None, 35, 35, 64) 0 batch_normalization_66[0][0]
__________________________________________________________________________________________________
block35_9_mixed (Concatenate) (None, 35, 35, 128) 0 activation_61[0][0]
activation_63[0][0]
activation_66[0][0]
__________________________________________________________________________________________________
block35_9_conv (Conv2D) (None, 35, 35, 320) 41280 block35_9_mixed[0][0]
__________________________________________________________________________________________________
block35_9 (Lambda) (None, 35, 35, 320) 0 block35_8_ac[0][0]
block35_9_conv[0][0]
__________________________________________________________________________________________________
block35_9_ac (Activation) (None, 35, 35, 320) 0 block35_9[0][0]
__________________________________________________________________________________________________
conv2d_70 (Conv2D) (None, 35, 35, 32) 10240 block35_9_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 35, 35, 32) 96 conv2d_70[0][0]
__________________________________________________________________________________________________
activation_70 (Activation) (None, 35, 35, 32) 0 batch_normalization_70[0][0]
__________________________________________________________________________________________________
conv2d_68 (Conv2D) (None, 35, 35, 32) 10240 block35_9_ac[0][0]
__________________________________________________________________________________________________
conv2d_71 (Conv2D) (None, 35, 35, 48) 13824 activation_70[0][0]
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 35, 35, 32) 96 conv2d_68[0][0]
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 35, 35, 48) 144 conv2d_71[0][0]
__________________________________________________________________________________________________
activation_68 (Activation) (None, 35, 35, 32) 0 batch_normalization_68[0][0]
__________________________________________________________________________________________________
activation_71 (Activation) (None, 35, 35, 48) 0 batch_normalization_71[0][0]
__________________________________________________________________________________________________
conv2d_67 (Conv2D) (None, 35, 35, 32) 10240 block35_9_ac[0][0]
__________________________________________________________________________________________________
conv2d_69 (Conv2D) (None, 35, 35, 32) 9216 activation_68[0][0]
__________________________________________________________________________________________________
conv2d_72 (Conv2D) (None, 35, 35, 64) 27648 activation_71[0][0]
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 35, 35, 32) 96 conv2d_67[0][0]
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 35, 35, 32) 96 conv2d_69[0][0]
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 35, 35, 64) 192 conv2d_72[0][0]
__________________________________________________________________________________________________
activation_67 (Activation) (None, 35, 35, 32) 0 batch_normalization_67[0][0]
__________________________________________________________________________________________________
activation_69 (Activation) (None, 35, 35, 32) 0 batch_normalization_69[0][0]
__________________________________________________________________________________________________
activation_72 (Activation) (None, 35, 35, 64) 0 batch_normalization_72[0][0]
__________________________________________________________________________________________________
block35_10_mixed (Concatenate) (None, 35, 35, 128) 0 activation_67[0][0]
activation_69[0][0]
activation_72[0][0]
__________________________________________________________________________________________________
block35_10_conv (Conv2D) (None, 35, 35, 320) 41280 block35_10_mixed[0][0]
__________________________________________________________________________________________________
block35_10 (Lambda) (None, 35, 35, 320) 0 block35_9_ac[0][0]
block35_10_conv[0][0]
__________________________________________________________________________________________________
block35_10_ac (Activation) (None, 35, 35, 320) 0 block35_10[0][0]
__________________________________________________________________________________________________
conv2d_74 (Conv2D) (None, 35, 35, 256) 81920 block35_10_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 35, 35, 256) 768 conv2d_74[0][0]
__________________________________________________________________________________________________
activation_74 (Activation) (None, 35, 35, 256) 0 batch_normalization_74[0][0]
__________________________________________________________________________________________________
conv2d_75 (Conv2D) (None, 35, 35, 256) 589824 activation_74[0][0]
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 35, 35, 256) 768 conv2d_75[0][0]
__________________________________________________________________________________________________
activation_75 (Activation) (None, 35, 35, 256) 0 batch_normalization_75[0][0]
__________________________________________________________________________________________________
conv2d_73 (Conv2D) (None, 17, 17, 384) 1105920 block35_10_ac[0][0]
__________________________________________________________________________________________________
conv2d_76 (Conv2D) (None, 17, 17, 384) 884736 activation_75[0][0]
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 17, 17, 384) 1152 conv2d_73[0][0]
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 17, 17, 384) 1152 conv2d_76[0][0]
__________________________________________________________________________________________________
activation_73 (Activation) (None, 17, 17, 384) 0 batch_normalization_73[0][0]
__________________________________________________________________________________________________
activation_76 (Activation) (None, 17, 17, 384) 0 batch_normalization_76[0][0]
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D) (None, 17, 17, 320) 0 block35_10_ac[0][0]
__________________________________________________________________________________________________
mixed_6a (Concatenate) (None, 17, 17, 1088) 0 activation_73[0][0]
activation_76[0][0]
max_pooling2d_3[0][0]
__________________________________________________________________________________________________
conv2d_78 (Conv2D) (None, 17, 17, 128) 139264 mixed_6a[0][0]
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 17, 17, 128) 384 conv2d_78[0][0]
__________________________________________________________________________________________________
activation_78 (Activation) (None, 17, 17, 128) 0 batch_normalization_78[0][0]
__________________________________________________________________________________________________
conv2d_79 (Conv2D) (None, 17, 17, 160) 143360 activation_78[0][0]
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 17, 17, 160) 480 conv2d_79[0][0]
__________________________________________________________________________________________________
activation_79 (Activation) (None, 17, 17, 160) 0 batch_normalization_79[0][0]
__________________________________________________________________________________________________
conv2d_77 (Conv2D) (None, 17, 17, 192) 208896 mixed_6a[0][0]
__________________________________________________________________________________________________
conv2d_80 (Conv2D) (None, 17, 17, 192) 215040 activation_79[0][0]
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 17, 17, 192) 576 conv2d_77[0][0]
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 17, 17, 192) 576 conv2d_80[0][0]
__________________________________________________________________________________________________
activation_77 (Activation) (None, 17, 17, 192) 0 batch_normalization_77[0][0]
__________________________________________________________________________________________________
activation_80 (Activation) (None, 17, 17, 192) 0 batch_normalization_80[0][0]
__________________________________________________________________________________________________
block17_1_mixed (Concatenate) (None, 17, 17, 384) 0 activation_77[0][0]
activation_80[0][0]
__________________________________________________________________________________________________
block17_1_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_1_mixed[0][0]
__________________________________________________________________________________________________
block17_1 (Lambda) (None, 17, 17, 1088) 0 mixed_6a[0][0]
block17_1_conv[0][0]
__________________________________________________________________________________________________
block17_1_ac (Activation) (None, 17, 17, 1088) 0 block17_1[0][0]
__________________________________________________________________________________________________
conv2d_82 (Conv2D) (None, 17, 17, 128) 139264 block17_1_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, 17, 17, 128) 384 conv2d_82[0][0]
__________________________________________________________________________________________________
activation_82 (Activation) (None, 17, 17, 128) 0 batch_normalization_82[0][0]
__________________________________________________________________________________________________
conv2d_83 (Conv2D) (None, 17, 17, 160) 143360 activation_82[0][0]
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, 17, 17, 160) 480 conv2d_83[0][0]
__________________________________________________________________________________________________
activation_83 (Activation) (None, 17, 17, 160) 0 batch_normalization_83[0][0]
__________________________________________________________________________________________________
conv2d_81 (Conv2D) (None, 17, 17, 192) 208896 block17_1_ac[0][0]
__________________________________________________________________________________________________
conv2d_84 (Conv2D) (None, 17, 17, 192) 215040 activation_83[0][0]
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, 17, 17, 192) 576 conv2d_81[0][0]
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, 17, 17, 192) 576 conv2d_84[0][0]
__________________________________________________________________________________________________
activation_81 (Activation) (None, 17, 17, 192) 0 batch_normalization_81[0][0]
__________________________________________________________________________________________________
activation_84 (Activation) (None, 17, 17, 192) 0 batch_normalization_84[0][0]
__________________________________________________________________________________________________
block17_2_mixed (Concatenate) (None, 17, 17, 384) 0 activation_81[0][0]
activation_84[0][0]
__________________________________________________________________________________________________
block17_2_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_2_mixed[0][0]
__________________________________________________________________________________________________
block17_2 (Lambda) (None, 17, 17, 1088) 0 block17_1_ac[0][0]
block17_2_conv[0][0]
__________________________________________________________________________________________________
block17_2_ac (Activation) (None, 17, 17, 1088) 0 block17_2[0][0]
__________________________________________________________________________________________________
conv2d_86 (Conv2D) (None, 17, 17, 128) 139264 block17_2_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, 17, 17, 128) 384 conv2d_86[0][0]
__________________________________________________________________________________________________
activation_86 (Activation) (None, 17, 17, 128) 0 batch_normalization_86[0][0]
__________________________________________________________________________________________________
conv2d_87 (Conv2D) (None, 17, 17, 160) 143360 activation_86[0][0]
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, 17, 17, 160) 480 conv2d_87[0][0]
__________________________________________________________________________________________________
activation_87 (Activation) (None, 17, 17, 160) 0 batch_normalization_87[0][0]
__________________________________________________________________________________________________
conv2d_85 (Conv2D) (None, 17, 17, 192) 208896 block17_2_ac[0][0]
__________________________________________________________________________________________________
conv2d_88 (Conv2D) (None, 17, 17, 192) 215040 activation_87[0][0]
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, 17, 17, 192) 576 conv2d_85[0][0]
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, 17, 17, 192) 576 conv2d_88[0][0]
__________________________________________________________________________________________________
activation_85 (Activation) (None, 17, 17, 192) 0 batch_normalization_85[0][0]
__________________________________________________________________________________________________
activation_88 (Activation) (None, 17, 17, 192) 0 batch_normalization_88[0][0]
__________________________________________________________________________________________________
block17_3_mixed (Concatenate) (None, 17, 17, 384) 0 activation_85[0][0]
activation_88[0][0]
__________________________________________________________________________________________________
block17_3_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_3_mixed[0][0]
__________________________________________________________________________________________________
block17_3 (Lambda) (None, 17, 17, 1088) 0 block17_2_ac[0][0]
block17_3_conv[0][0]
__________________________________________________________________________________________________
block17_3_ac (Activation) (None, 17, 17, 1088) 0 block17_3[0][0]
__________________________________________________________________________________________________
conv2d_90 (Conv2D) (None, 17, 17, 128) 139264 block17_3_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, 17, 17, 128) 384 conv2d_90[0][0]
__________________________________________________________________________________________________
activation_90 (Activation) (None, 17, 17, 128) 0 batch_normalization_90[0][0]
__________________________________________________________________________________________________
conv2d_91 (Conv2D) (None, 17, 17, 160) 143360 activation_90[0][0]
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, 17, 17, 160) 480 conv2d_91[0][0]
__________________________________________________________________________________________________
activation_91 (Activation) (None, 17, 17, 160) 0 batch_normalization_91[0][0]
__________________________________________________________________________________________________
conv2d_89 (Conv2D) (None, 17, 17, 192) 208896 block17_3_ac[0][0]
__________________________________________________________________________________________________
conv2d_92 (Conv2D) (None, 17, 17, 192) 215040 activation_91[0][0]
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, 17, 17, 192) 576 conv2d_89[0][0]
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, 17, 17, 192) 576 conv2d_92[0][0]
__________________________________________________________________________________________________
activation_89 (Activation) (None, 17, 17, 192) 0 batch_normalization_89[0][0]
__________________________________________________________________________________________________
activation_92 (Activation) (None, 17, 17, 192) 0 batch_normalization_92[0][0]
__________________________________________________________________________________________________
block17_4_mixed (Concatenate) (None, 17, 17, 384) 0 activation_89[0][0]
activation_92[0][0]
__________________________________________________________________________________________________
block17_4_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_4_mixed[0][0]
__________________________________________________________________________________________________
block17_4 (Lambda) (None, 17, 17, 1088) 0 block17_3_ac[0][0]
block17_4_conv[0][0]
__________________________________________________________________________________________________
block17_4_ac (Activation) (None, 17, 17, 1088) 0 block17_4[0][0]
__________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, 17, 17, 128) 139264 block17_4_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, 17, 17, 128) 384 conv2d_94[0][0]
__________________________________________________________________________________________________
activation_94 (Activation) (None, 17, 17, 128) 0 batch_normalization_94[0][0]
__________________________________________________________________________________________________
conv2d_95 (Conv2D) (None, 17, 17, 160) 143360 activation_94[0][0]
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, 17, 17, 160) 480 conv2d_95[0][0]
__________________________________________________________________________________________________
activation_95 (Activation) (None, 17, 17, 160) 0 batch_normalization_95[0][0]
__________________________________________________________________________________________________
conv2d_93 (Conv2D) (None, 17, 17, 192) 208896 block17_4_ac[0][0]
__________________________________________________________________________________________________
conv2d_96 (Conv2D) (None, 17, 17, 192) 215040 activation_95[0][0]
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, 17, 17, 192) 576 conv2d_93[0][0]
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, 17, 17, 192) 576 conv2d_96[0][0]
__________________________________________________________________________________________________
activation_93 (Activation) (None, 17, 17, 192) 0 batch_normalization_93[0][0]
__________________________________________________________________________________________________
activation_96 (Activation) (None, 17, 17, 192) 0 batch_normalization_96[0][0]
__________________________________________________________________________________________________
block17_5_mixed (Concatenate) (None, 17, 17, 384) 0 activation_93[0][0]
activation_96[0][0]
__________________________________________________________________________________________________
block17_5_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_5_mixed[0][0]
__________________________________________________________________________________________________
block17_5 (Lambda) (None, 17, 17, 1088) 0 block17_4_ac[0][0]
block17_5_conv[0][0]
__________________________________________________________________________________________________
block17_5_ac (Activation) (None, 17, 17, 1088) 0 block17_5[0][0]
__________________________________________________________________________________________________
conv2d_98 (Conv2D) (None, 17, 17, 128) 139264 block17_5_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, 17, 17, 128) 384 conv2d_98[0][0]
__________________________________________________________________________________________________
activation_98 (Activation) (None, 17, 17, 128) 0 batch_normalization_98[0][0]
__________________________________________________________________________________________________
conv2d_99 (Conv2D) (None, 17, 17, 160) 143360 activation_98[0][0]
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, 17, 17, 160) 480 conv2d_99[0][0]
__________________________________________________________________________________________________
activation_99 (Activation) (None, 17, 17, 160) 0 batch_normalization_99[0][0]
__________________________________________________________________________________________________
conv2d_97 (Conv2D) (None, 17, 17, 192) 208896 block17_5_ac[0][0]
__________________________________________________________________________________________________
conv2d_100 (Conv2D) (None, 17, 17, 192) 215040 activation_99[0][0]
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, 17, 17, 192) 576 conv2d_97[0][0]
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, 17, 17, 192) 576 conv2d_100[0][0]
__________________________________________________________________________________________________
activation_97 (Activation) (None, 17, 17, 192) 0 batch_normalization_97[0][0]
__________________________________________________________________________________________________
activation_100 (Activation) (None, 17, 17, 192) 0 batch_normalization_100[0][0]
__________________________________________________________________________________________________
block17_6_mixed (Concatenate) (None, 17, 17, 384) 0 activation_97[0][0]
activation_100[0][0]
__________________________________________________________________________________________________
block17_6_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_6_mixed[0][0]
__________________________________________________________________________________________________
block17_6 (Lambda) (None, 17, 17, 1088) 0 block17_5_ac[0][0]
block17_6_conv[0][0]
__________________________________________________________________________________________________
block17_6_ac (Activation) (None, 17, 17, 1088) 0 block17_6[0][0]
__________________________________________________________________________________________________
conv2d_102 (Conv2D) (None, 17, 17, 128) 139264 block17_6_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, 17, 17, 128) 384 conv2d_102[0][0]
__________________________________________________________________________________________________
activation_102 (Activation) (None, 17, 17, 128) 0 batch_normalization_102[0][0]
__________________________________________________________________________________________________
conv2d_103 (Conv2D) (None, 17, 17, 160) 143360 activation_102[0][0]
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, 17, 17, 160) 480 conv2d_103[0][0]
__________________________________________________________________________________________________
activation_103 (Activation) (None, 17, 17, 160) 0 batch_normalization_103[0][0]
__________________________________________________________________________________________________
conv2d_101 (Conv2D) (None, 17, 17, 192) 208896 block17_6_ac[0][0]
__________________________________________________________________________________________________
conv2d_104 (Conv2D) (None, 17, 17, 192) 215040 activation_103[0][0]
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, 17, 17, 192) 576 conv2d_101[0][0]
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, 17, 17, 192) 576 conv2d_104[0][0]
__________________________________________________________________________________________________
activation_101 (Activation) (None, 17, 17, 192) 0 batch_normalization_101[0][0]
__________________________________________________________________________________________________
activation_104 (Activation) (None, 17, 17, 192) 0 batch_normalization_104[0][0]
__________________________________________________________________________________________________
block17_7_mixed (Concatenate) (None, 17, 17, 384) 0 activation_101[0][0]
activation_104[0][0]
__________________________________________________________________________________________________
block17_7_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_7_mixed[0][0]
__________________________________________________________________________________________________
block17_7 (Lambda) (None, 17, 17, 1088) 0 block17_6_ac[0][0]
block17_7_conv[0][0]
__________________________________________________________________________________________________
block17_7_ac (Activation) (None, 17, 17, 1088) 0 block17_7[0][0]
__________________________________________________________________________________________________
conv2d_106 (Conv2D) (None, 17, 17, 128) 139264 block17_7_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_106 (BatchN (None, 17, 17, 128) 384 conv2d_106[0][0]
__________________________________________________________________________________________________
activation_106 (Activation) (None, 17, 17, 128) 0 batch_normalization_106[0][0]
__________________________________________________________________________________________________
conv2d_107 (Conv2D) (None, 17, 17, 160) 143360 activation_106[0][0]
__________________________________________________________________________________________________
batch_normalization_107 (BatchN (None, 17, 17, 160) 480 conv2d_107[0][0]
__________________________________________________________________________________________________
activation_107 (Activation) (None, 17, 17, 160) 0 batch_normalization_107[0][0]
__________________________________________________________________________________________________
conv2d_105 (Conv2D) (None, 17, 17, 192) 208896 block17_7_ac[0][0]
__________________________________________________________________________________________________
conv2d_108 (Conv2D) (None, 17, 17, 192) 215040 activation_107[0][0]
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, 17, 17, 192) 576 conv2d_105[0][0]
__________________________________________________________________________________________________
batch_normalization_108 (BatchN (None, 17, 17, 192) 576 conv2d_108[0][0]
__________________________________________________________________________________________________
activation_105 (Activation) (None, 17, 17, 192) 0 batch_normalization_105[0][0]
__________________________________________________________________________________________________
activation_108 (Activation) (None, 17, 17, 192) 0 batch_normalization_108[0][0]
__________________________________________________________________________________________________
block17_8_mixed (Concatenate) (None, 17, 17, 384) 0 activation_105[0][0]
activation_108[0][0]
__________________________________________________________________________________________________
block17_8_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_8_mixed[0][0]
__________________________________________________________________________________________________
block17_8 (Lambda) (None, 17, 17, 1088) 0 block17_7_ac[0][0]
block17_8_conv[0][0]
__________________________________________________________________________________________________
block17_8_ac (Activation) (None, 17, 17, 1088) 0 block17_8[0][0]
__________________________________________________________________________________________________
conv2d_110 (Conv2D) (None, 17, 17, 128) 139264 block17_8_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_110 (BatchN (None, 17, 17, 128) 384 conv2d_110[0][0]
__________________________________________________________________________________________________
activation_110 (Activation) (None, 17, 17, 128) 0 batch_normalization_110[0][0]
__________________________________________________________________________________________________
conv2d_111 (Conv2D) (None, 17, 17, 160) 143360 activation_110[0][0]
__________________________________________________________________________________________________
batch_normalization_111 (BatchN (None, 17, 17, 160) 480 conv2d_111[0][0]
__________________________________________________________________________________________________
activation_111 (Activation) (None, 17, 17, 160) 0 batch_normalization_111[0][0]
__________________________________________________________________________________________________
conv2d_109 (Conv2D) (None, 17, 17, 192) 208896 block17_8_ac[0][0]
__________________________________________________________________________________________________
conv2d_112 (Conv2D) (None, 17, 17, 192) 215040 activation_111[0][0]
__________________________________________________________________________________________________
batch_normalization_109 (BatchN (None, 17, 17, 192) 576 conv2d_109[0][0]
__________________________________________________________________________________________________
batch_normalization_112 (BatchN (None, 17, 17, 192) 576 conv2d_112[0][0]
__________________________________________________________________________________________________
activation_109 (Activation) (None, 17, 17, 192) 0 batch_normalization_109[0][0]
__________________________________________________________________________________________________
activation_112 (Activation) (None, 17, 17, 192) 0 batch_normalization_112[0][0]
__________________________________________________________________________________________________
block17_9_mixed (Concatenate) (None, 17, 17, 384) 0 activation_109[0][0]
activation_112[0][0]
__________________________________________________________________________________________________
block17_9_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_9_mixed[0][0]
__________________________________________________________________________________________________
block17_9 (Lambda) (None, 17, 17, 1088) 0 block17_8_ac[0][0]
block17_9_conv[0][0]
__________________________________________________________________________________________________
block17_9_ac (Activation) (None, 17, 17, 1088) 0 block17_9[0][0]
__________________________________________________________________________________________________
conv2d_114 (Conv2D) (None, 17, 17, 128) 139264 block17_9_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_114 (BatchN (None, 17, 17, 128) 384 conv2d_114[0][0]
__________________________________________________________________________________________________
activation_114 (Activation) (None, 17, 17, 128) 0 batch_normalization_114[0][0]
__________________________________________________________________________________________________
conv2d_115 (Conv2D) (None, 17, 17, 160) 143360 activation_114[0][0]
__________________________________________________________________________________________________
batch_normalization_115 (BatchN (None, 17, 17, 160) 480 conv2d_115[0][0]
__________________________________________________________________________________________________
activation_115 (Activation) (None, 17, 17, 160) 0 batch_normalization_115[0][0]
__________________________________________________________________________________________________
conv2d_113 (Conv2D) (None, 17, 17, 192) 208896 block17_9_ac[0][0]
__________________________________________________________________________________________________
conv2d_116 (Conv2D) (None, 17, 17, 192) 215040 activation_115[0][0]
__________________________________________________________________________________________________
batch_normalization_113 (BatchN (None, 17, 17, 192) 576 conv2d_113[0][0]
__________________________________________________________________________________________________
batch_normalization_116 (BatchN (None, 17, 17, 192) 576 conv2d_116[0][0]
__________________________________________________________________________________________________
activation_113 (Activation) (None, 17, 17, 192) 0 batch_normalization_113[0][0]
__________________________________________________________________________________________________
activation_116 (Activation) (None, 17, 17, 192) 0 batch_normalization_116[0][0]
__________________________________________________________________________________________________
block17_10_mixed (Concatenate) (None, 17, 17, 384) 0 activation_113[0][0]
activation_116[0][0]
__________________________________________________________________________________________________
block17_10_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_10_mixed[0][0]
__________________________________________________________________________________________________
block17_10 (Lambda) (None, 17, 17, 1088) 0 block17_9_ac[0][0]
block17_10_conv[0][0]
__________________________________________________________________________________________________
block17_10_ac (Activation) (None, 17, 17, 1088) 0 block17_10[0][0]
__________________________________________________________________________________________________
conv2d_118 (Conv2D) (None, 17, 17, 128) 139264 block17_10_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_118 (BatchN (None, 17, 17, 128) 384 conv2d_118[0][0]
__________________________________________________________________________________________________
activation_118 (Activation) (None, 17, 17, 128) 0 batch_normalization_118[0][0]
__________________________________________________________________________________________________
conv2d_119 (Conv2D) (None, 17, 17, 160) 143360 activation_118[0][0]
__________________________________________________________________________________________________
batch_normalization_119 (BatchN (None, 17, 17, 160) 480 conv2d_119[0][0]
__________________________________________________________________________________________________
activation_119 (Activation) (None, 17, 17, 160) 0 batch_normalization_119[0][0]
__________________________________________________________________________________________________
conv2d_117 (Conv2D) (None, 17, 17, 192) 208896 block17_10_ac[0][0]
__________________________________________________________________________________________________
conv2d_120 (Conv2D) (None, 17, 17, 192) 215040 activation_119[0][0]
__________________________________________________________________________________________________
batch_normalization_117 (BatchN (None, 17, 17, 192) 576 conv2d_117[0][0]
__________________________________________________________________________________________________
batch_normalization_120 (BatchN (None, 17, 17, 192) 576 conv2d_120[0][0]
__________________________________________________________________________________________________
activation_117 (Activation) (None, 17, 17, 192) 0 batch_normalization_117[0][0]
__________________________________________________________________________________________________
activation_120 (Activation) (None, 17, 17, 192) 0 batch_normalization_120[0][0]
__________________________________________________________________________________________________
block17_11_mixed (Concatenate) (None, 17, 17, 384) 0 activation_117[0][0]
activation_120[0][0]
__________________________________________________________________________________________________
block17_11_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_11_mixed[0][0]
__________________________________________________________________________________________________
block17_11 (Lambda) (None, 17, 17, 1088) 0 block17_10_ac[0][0]
block17_11_conv[0][0]
__________________________________________________________________________________________________
block17_11_ac (Activation) (None, 17, 17, 1088) 0 block17_11[0][0]
__________________________________________________________________________________________________
conv2d_122 (Conv2D) (None, 17, 17, 128) 139264 block17_11_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_122 (BatchN (None, 17, 17, 128) 384 conv2d_122[0][0]
__________________________________________________________________________________________________
activation_122 (Activation) (None, 17, 17, 128) 0 batch_normalization_122[0][0]
__________________________________________________________________________________________________
conv2d_123 (Conv2D) (None, 17, 17, 160) 143360 activation_122[0][0]
__________________________________________________________________________________________________
batch_normalization_123 (BatchN (None, 17, 17, 160) 480 conv2d_123[0][0]
__________________________________________________________________________________________________
activation_123 (Activation) (None, 17, 17, 160) 0 batch_normalization_123[0][0]
__________________________________________________________________________________________________
conv2d_121 (Conv2D) (None, 17, 17, 192) 208896 block17_11_ac[0][0]
__________________________________________________________________________________________________
conv2d_124 (Conv2D) (None, 17, 17, 192) 215040 activation_123[0][0]
__________________________________________________________________________________________________
batch_normalization_121 (BatchN (None, 17, 17, 192) 576 conv2d_121[0][0]
__________________________________________________________________________________________________
batch_normalization_124 (BatchN (None, 17, 17, 192) 576 conv2d_124[0][0]
__________________________________________________________________________________________________
activation_121 (Activation) (None, 17, 17, 192) 0 batch_normalization_121[0][0]
__________________________________________________________________________________________________
activation_124 (Activation) (None, 17, 17, 192) 0 batch_normalization_124[0][0]
__________________________________________________________________________________________________
block17_12_mixed (Concatenate) (None, 17, 17, 384) 0 activation_121[0][0]
activation_124[0][0]
__________________________________________________________________________________________________
block17_12_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_12_mixed[0][0]
__________________________________________________________________________________________________
block17_12 (Lambda) (None, 17, 17, 1088) 0 block17_11_ac[0][0]
block17_12_conv[0][0]
__________________________________________________________________________________________________
block17_12_ac (Activation) (None, 17, 17, 1088) 0 block17_12[0][0]
__________________________________________________________________________________________________
conv2d_126 (Conv2D) (None, 17, 17, 128) 139264 block17_12_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_126 (BatchN (None, 17, 17, 128) 384 conv2d_126[0][0]
__________________________________________________________________________________________________
activation_126 (Activation) (None, 17, 17, 128) 0 batch_normalization_126[0][0]
__________________________________________________________________________________________________
conv2d_127 (Conv2D) (None, 17, 17, 160) 143360 activation_126[0][0]
__________________________________________________________________________________________________
batch_normalization_127 (BatchN (None, 17, 17, 160) 480 conv2d_127[0][0]
__________________________________________________________________________________________________
activation_127 (Activation) (None, 17, 17, 160) 0 batch_normalization_127[0][0]
__________________________________________________________________________________________________
conv2d_125 (Conv2D) (None, 17, 17, 192) 208896 block17_12_ac[0][0]
__________________________________________________________________________________________________
conv2d_128 (Conv2D) (None, 17, 17, 192) 215040 activation_127[0][0]
__________________________________________________________________________________________________
batch_normalization_125 (BatchN (None, 17, 17, 192) 576 conv2d_125[0][0]
__________________________________________________________________________________________________
batch_normalization_128 (BatchN (None, 17, 17, 192) 576 conv2d_128[0][0]
__________________________________________________________________________________________________
activation_125 (Activation) (None, 17, 17, 192) 0 batch_normalization_125[0][0]
__________________________________________________________________________________________________
activation_128 (Activation) (None, 17, 17, 192) 0 batch_normalization_128[0][0]
__________________________________________________________________________________________________
block17_13_mixed (Concatenate) (None, 17, 17, 384) 0 activation_125[0][0]
activation_128[0][0]
__________________________________________________________________________________________________
block17_13_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_13_mixed[0][0]
__________________________________________________________________________________________________
block17_13 (Lambda) (None, 17, 17, 1088) 0 block17_12_ac[0][0]
block17_13_conv[0][0]
__________________________________________________________________________________________________
block17_13_ac (Activation) (None, 17, 17, 1088) 0 block17_13[0][0]
__________________________________________________________________________________________________
conv2d_130 (Conv2D) (None, 17, 17, 128) 139264 block17_13_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_130 (BatchN (None, 17, 17, 128) 384 conv2d_130[0][0]
__________________________________________________________________________________________________
activation_130 (Activation) (None, 17, 17, 128) 0 batch_normalization_130[0][0]
__________________________________________________________________________________________________
conv2d_131 (Conv2D) (None, 17, 17, 160) 143360 activation_130[0][0]
__________________________________________________________________________________________________
batch_normalization_131 (BatchN (None, 17, 17, 160) 480 conv2d_131[0][0]
__________________________________________________________________________________________________
activation_131 (Activation) (None, 17, 17, 160) 0 batch_normalization_131[0][0]
__________________________________________________________________________________________________
conv2d_129 (Conv2D) (None, 17, 17, 192) 208896 block17_13_ac[0][0]
__________________________________________________________________________________________________
conv2d_132 (Conv2D) (None, 17, 17, 192) 215040 activation_131[0][0]
__________________________________________________________________________________________________
batch_normalization_129 (BatchN (None, 17, 17, 192) 576 conv2d_129[0][0]
__________________________________________________________________________________________________
batch_normalization_132 (BatchN (None, 17, 17, 192) 576 conv2d_132[0][0]
__________________________________________________________________________________________________
activation_129 (Activation) (None, 17, 17, 192) 0 batch_normalization_129[0][0]
__________________________________________________________________________________________________
activation_132 (Activation) (None, 17, 17, 192) 0 batch_normalization_132[0][0]
__________________________________________________________________________________________________
block17_14_mixed (Concatenate) (None, 17, 17, 384) 0 activation_129[0][0]
activation_132[0][0]
__________________________________________________________________________________________________
block17_14_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_14_mixed[0][0]
__________________________________________________________________________________________________
block17_14 (Lambda) (None, 17, 17, 1088) 0 block17_13_ac[0][0]
block17_14_conv[0][0]
__________________________________________________________________________________________________
block17_14_ac (Activation) (None, 17, 17, 1088) 0 block17_14[0][0]
__________________________________________________________________________________________________
conv2d_134 (Conv2D) (None, 17, 17, 128) 139264 block17_14_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_134 (BatchN (None, 17, 17, 128) 384 conv2d_134[0][0]
__________________________________________________________________________________________________
activation_134 (Activation) (None, 17, 17, 128) 0 batch_normalization_134[0][0]
__________________________________________________________________________________________________
conv2d_135 (Conv2D) (None, 17, 17, 160) 143360 activation_134[0][0]
__________________________________________________________________________________________________
batch_normalization_135 (BatchN (None, 17, 17, 160) 480 conv2d_135[0][0]
__________________________________________________________________________________________________
activation_135 (Activation) (None, 17, 17, 160) 0 batch_normalization_135[0][0]
__________________________________________________________________________________________________
conv2d_133 (Conv2D) (None, 17, 17, 192) 208896 block17_14_ac[0][0]
__________________________________________________________________________________________________
conv2d_136 (Conv2D) (None, 17, 17, 192) 215040 activation_135[0][0]
__________________________________________________________________________________________________
batch_normalization_133 (BatchN (None, 17, 17, 192) 576 conv2d_133[0][0]
__________________________________________________________________________________________________
batch_normalization_136 (BatchN (None, 17, 17, 192) 576 conv2d_136[0][0]
__________________________________________________________________________________________________
activation_133 (Activation) (None, 17, 17, 192) 0 batch_normalization_133[0][0]
__________________________________________________________________________________________________
activation_136 (Activation) (None, 17, 17, 192) 0 batch_normalization_136[0][0]
__________________________________________________________________________________________________
block17_15_mixed (Concatenate) (None, 17, 17, 384) 0 activation_133[0][0]
activation_136[0][0]
__________________________________________________________________________________________________
block17_15_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_15_mixed[0][0]
__________________________________________________________________________________________________
block17_15 (Lambda) (None, 17, 17, 1088) 0 block17_14_ac[0][0]
block17_15_conv[0][0]
__________________________________________________________________________________________________
block17_15_ac (Activation) (None, 17, 17, 1088) 0 block17_15[0][0]
__________________________________________________________________________________________________
conv2d_138 (Conv2D) (None, 17, 17, 128) 139264 block17_15_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_138 (BatchN (None, 17, 17, 128) 384 conv2d_138[0][0]
__________________________________________________________________________________________________
activation_138 (Activation) (None, 17, 17, 128) 0 batch_normalization_138[0][0]
__________________________________________________________________________________________________
conv2d_139 (Conv2D) (None, 17, 17, 160) 143360 activation_138[0][0]
__________________________________________________________________________________________________
batch_normalization_139 (BatchN (None, 17, 17, 160) 480 conv2d_139[0][0]
__________________________________________________________________________________________________
activation_139 (Activation) (None, 17, 17, 160) 0 batch_normalization_139[0][0]
__________________________________________________________________________________________________
conv2d_137 (Conv2D) (None, 17, 17, 192) 208896 block17_15_ac[0][0]
__________________________________________________________________________________________________
conv2d_140 (Conv2D) (None, 17, 17, 192) 215040 activation_139[0][0]
__________________________________________________________________________________________________
batch_normalization_137 (BatchN (None, 17, 17, 192) 576 conv2d_137[0][0]
__________________________________________________________________________________________________
batch_normalization_140 (BatchN (None, 17, 17, 192) 576 conv2d_140[0][0]
__________________________________________________________________________________________________
activation_137 (Activation) (None, 17, 17, 192) 0 batch_normalization_137[0][0]
__________________________________________________________________________________________________
activation_140 (Activation) (None, 17, 17, 192) 0 batch_normalization_140[0][0]
__________________________________________________________________________________________________
block17_16_mixed (Concatenate) (None, 17, 17, 384) 0 activation_137[0][0]
activation_140[0][0]
__________________________________________________________________________________________________
block17_16_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_16_mixed[0][0]
__________________________________________________________________________________________________
block17_16 (Lambda) (None, 17, 17, 1088) 0 block17_15_ac[0][0]
block17_16_conv[0][0]
__________________________________________________________________________________________________
block17_16_ac (Activation) (None, 17, 17, 1088) 0 block17_16[0][0]
__________________________________________________________________________________________________
conv2d_142 (Conv2D) (None, 17, 17, 128) 139264 block17_16_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_142 (BatchN (None, 17, 17, 128) 384 conv2d_142[0][0]
__________________________________________________________________________________________________
activation_142 (Activation) (None, 17, 17, 128) 0 batch_normalization_142[0][0]
__________________________________________________________________________________________________
conv2d_143 (Conv2D) (None, 17, 17, 160) 143360 activation_142[0][0]
__________________________________________________________________________________________________
batch_normalization_143 (BatchN (None, 17, 17, 160) 480 conv2d_143[0][0]
__________________________________________________________________________________________________
activation_143 (Activation) (None, 17, 17, 160) 0 batch_normalization_143[0][0]
__________________________________________________________________________________________________
conv2d_141 (Conv2D) (None, 17, 17, 192) 208896 block17_16_ac[0][0]
__________________________________________________________________________________________________
conv2d_144 (Conv2D) (None, 17, 17, 192) 215040 activation_143[0][0]
__________________________________________________________________________________________________
batch_normalization_141 (BatchN (None, 17, 17, 192) 576 conv2d_141[0][0]
__________________________________________________________________________________________________
batch_normalization_144 (BatchN (None, 17, 17, 192) 576 conv2d_144[0][0]
__________________________________________________________________________________________________
activation_141 (Activation) (None, 17, 17, 192) 0 batch_normalization_141[0][0]
__________________________________________________________________________________________________
activation_144 (Activation) (None, 17, 17, 192) 0 batch_normalization_144[0][0]
__________________________________________________________________________________________________
block17_17_mixed (Concatenate) (None, 17, 17, 384) 0 activation_141[0][0]
activation_144[0][0]
__________________________________________________________________________________________________
block17_17_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_17_mixed[0][0]
__________________________________________________________________________________________________
block17_17 (Lambda) (None, 17, 17, 1088) 0 block17_16_ac[0][0]
block17_17_conv[0][0]
__________________________________________________________________________________________________
block17_17_ac (Activation) (None, 17, 17, 1088) 0 block17_17[0][0]
__________________________________________________________________________________________________
conv2d_146 (Conv2D) (None, 17, 17, 128) 139264 block17_17_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_146 (BatchN (None, 17, 17, 128) 384 conv2d_146[0][0]
__________________________________________________________________________________________________
activation_146 (Activation) (None, 17, 17, 128) 0 batch_normalization_146[0][0]
__________________________________________________________________________________________________
conv2d_147 (Conv2D) (None, 17, 17, 160) 143360 activation_146[0][0]
__________________________________________________________________________________________________
batch_normalization_147 (BatchN (None, 17, 17, 160) 480 conv2d_147[0][0]
__________________________________________________________________________________________________
activation_147 (Activation) (None, 17, 17, 160) 0 batch_normalization_147[0][0]
__________________________________________________________________________________________________
conv2d_145 (Conv2D) (None, 17, 17, 192) 208896 block17_17_ac[0][0]
__________________________________________________________________________________________________
conv2d_148 (Conv2D) (None, 17, 17, 192) 215040 activation_147[0][0]
__________________________________________________________________________________________________
batch_normalization_145 (BatchN (None, 17, 17, 192) 576 conv2d_145[0][0]
__________________________________________________________________________________________________
batch_normalization_148 (BatchN (None, 17, 17, 192) 576 conv2d_148[0][0]
__________________________________________________________________________________________________
activation_145 (Activation) (None, 17, 17, 192) 0 batch_normalization_145[0][0]
__________________________________________________________________________________________________
activation_148 (Activation) (None, 17, 17, 192) 0 batch_normalization_148[0][0]
__________________________________________________________________________________________________
block17_18_mixed (Concatenate) (None, 17, 17, 384) 0 activation_145[0][0]
activation_148[0][0]
__________________________________________________________________________________________________
block17_18_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_18_mixed[0][0]
__________________________________________________________________________________________________
block17_18 (Lambda) (None, 17, 17, 1088) 0 block17_17_ac[0][0]
block17_18_conv[0][0]
__________________________________________________________________________________________________
block17_18_ac (Activation) (None, 17, 17, 1088) 0 block17_18[0][0]
__________________________________________________________________________________________________
conv2d_150 (Conv2D) (None, 17, 17, 128) 139264 block17_18_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_150 (BatchN (None, 17, 17, 128) 384 conv2d_150[0][0]
__________________________________________________________________________________________________
activation_150 (Activation) (None, 17, 17, 128) 0 batch_normalization_150[0][0]
__________________________________________________________________________________________________
conv2d_151 (Conv2D) (None, 17, 17, 160) 143360 activation_150[0][0]
__________________________________________________________________________________________________
batch_normalization_151 (BatchN (None, 17, 17, 160) 480 conv2d_151[0][0]
__________________________________________________________________________________________________
activation_151 (Activation) (None, 17, 17, 160) 0 batch_normalization_151[0][0]
__________________________________________________________________________________________________
conv2d_149 (Conv2D) (None, 17, 17, 192) 208896 block17_18_ac[0][0]
__________________________________________________________________________________________________
conv2d_152 (Conv2D) (None, 17, 17, 192) 215040 activation_151[0][0]
__________________________________________________________________________________________________
batch_normalization_149 (BatchN (None, 17, 17, 192) 576 conv2d_149[0][0]
__________________________________________________________________________________________________
batch_normalization_152 (BatchN (None, 17, 17, 192) 576 conv2d_152[0][0]
__________________________________________________________________________________________________
activation_149 (Activation) (None, 17, 17, 192) 0 batch_normalization_149[0][0]
__________________________________________________________________________________________________
activation_152 (Activation) (None, 17, 17, 192) 0 batch_normalization_152[0][0]
__________________________________________________________________________________________________
block17_19_mixed (Concatenate) (None, 17, 17, 384) 0 activation_149[0][0]
activation_152[0][0]
__________________________________________________________________________________________________
block17_19_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_19_mixed[0][0]
__________________________________________________________________________________________________
block17_19 (Lambda) (None, 17, 17, 1088) 0 block17_18_ac[0][0]
block17_19_conv[0][0]
__________________________________________________________________________________________________
block17_19_ac (Activation) (None, 17, 17, 1088) 0 block17_19[0][0]
__________________________________________________________________________________________________
conv2d_154 (Conv2D) (None, 17, 17, 128) 139264 block17_19_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_154 (BatchN (None, 17, 17, 128) 384 conv2d_154[0][0]
__________________________________________________________________________________________________
activation_154 (Activation) (None, 17, 17, 128) 0 batch_normalization_154[0][0]
__________________________________________________________________________________________________
conv2d_155 (Conv2D) (None, 17, 17, 160) 143360 activation_154[0][0]
__________________________________________________________________________________________________
batch_normalization_155 (BatchN (None, 17, 17, 160) 480 conv2d_155[0][0]
__________________________________________________________________________________________________
activation_155 (Activation) (None, 17, 17, 160) 0 batch_normalization_155[0][0]
__________________________________________________________________________________________________
conv2d_153 (Conv2D) (None, 17, 17, 192) 208896 block17_19_ac[0][0]
__________________________________________________________________________________________________
conv2d_156 (Conv2D) (None, 17, 17, 192) 215040 activation_155[0][0]
__________________________________________________________________________________________________
batch_normalization_153 (BatchN (None, 17, 17, 192) 576 conv2d_153[0][0]
__________________________________________________________________________________________________
batch_normalization_156 (BatchN (None, 17, 17, 192) 576 conv2d_156[0][0]
__________________________________________________________________________________________________
activation_153 (Activation) (None, 17, 17, 192) 0 batch_normalization_153[0][0]
__________________________________________________________________________________________________
activation_156 (Activation) (None, 17, 17, 192) 0 batch_normalization_156[0][0]
__________________________________________________________________________________________________
block17_20_mixed (Concatenate) (None, 17, 17, 384) 0 activation_153[0][0]
activation_156[0][0]
__________________________________________________________________________________________________
block17_20_conv (Conv2D) (None, 17, 17, 1088) 418880 block17_20_mixed[0][0]
__________________________________________________________________________________________________
block17_20 (Lambda) (None, 17, 17, 1088) 0 block17_19_ac[0][0]
block17_20_conv[0][0]
__________________________________________________________________________________________________
block17_20_ac (Activation) (None, 17, 17, 1088) 0 block17_20[0][0]
__________________________________________________________________________________________________
conv2d_161 (Conv2D) (None, 17, 17, 256) 278528 block17_20_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_161 (BatchN (None, 17, 17, 256) 768 conv2d_161[0][0]
__________________________________________________________________________________________________
activation_161 (Activation) (None, 17, 17, 256) 0 batch_normalization_161[0][0]
__________________________________________________________________________________________________
conv2d_157 (Conv2D) (None, 17, 17, 256) 278528 block17_20_ac[0][0]
__________________________________________________________________________________________________
conv2d_159 (Conv2D) (None, 17, 17, 256) 278528 block17_20_ac[0][0]
__________________________________________________________________________________________________
conv2d_162 (Conv2D) (None, 17, 17, 288) 663552 activation_161[0][0]
__________________________________________________________________________________________________
batch_normalization_157 (BatchN (None, 17, 17, 256) 768 conv2d_157[0][0]
__________________________________________________________________________________________________
batch_normalization_159 (BatchN (None, 17, 17, 256) 768 conv2d_159[0][0]
__________________________________________________________________________________________________
batch_normalization_162 (BatchN (None, 17, 17, 288) 864 conv2d_162[0][0]
__________________________________________________________________________________________________
activation_157 (Activation) (None, 17, 17, 256) 0 batch_normalization_157[0][0]
__________________________________________________________________________________________________
activation_159 (Activation) (None, 17, 17, 256) 0 batch_normalization_159[0][0]
__________________________________________________________________________________________________
activation_162 (Activation) (None, 17, 17, 288) 0 batch_normalization_162[0][0]
__________________________________________________________________________________________________
conv2d_158 (Conv2D) (None, 8, 8, 384) 884736 activation_157[0][0]
__________________________________________________________________________________________________
conv2d_160 (Conv2D) (None, 8, 8, 288) 663552 activation_159[0][0]
__________________________________________________________________________________________________
conv2d_163 (Conv2D) (None, 8, 8, 320) 829440 activation_162[0][0]
__________________________________________________________________________________________________
batch_normalization_158 (BatchN (None, 8, 8, 384) 1152 conv2d_158[0][0]
__________________________________________________________________________________________________
batch_normalization_160 (BatchN (None, 8, 8, 288) 864 conv2d_160[0][0]
__________________________________________________________________________________________________
batch_normalization_163 (BatchN (None, 8, 8, 320) 960 conv2d_163[0][0]
__________________________________________________________________________________________________
activation_158 (Activation) (None, 8, 8, 384) 0 batch_normalization_158[0][0]
__________________________________________________________________________________________________
activation_160 (Activation) (None, 8, 8, 288) 0 batch_normalization_160[0][0]
__________________________________________________________________________________________________
activation_163 (Activation) (None, 8, 8, 320) 0 batch_normalization_163[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 8, 8, 1088) 0 block17_20_ac[0][0]
__________________________________________________________________________________________________
mixed_7a (Concatenate) (None, 8, 8, 2080) 0 activation_158[0][0]
activation_160[0][0]
activation_163[0][0]
max_pooling2d_4[0][0]
__________________________________________________________________________________________________
conv2d_165 (Conv2D) (None, 8, 8, 192) 399360 mixed_7a[0][0]
__________________________________________________________________________________________________
batch_normalization_165 (BatchN (None, 8, 8, 192) 576 conv2d_165[0][0]
__________________________________________________________________________________________________
activation_165 (Activation) (None, 8, 8, 192) 0 batch_normalization_165[0][0]
__________________________________________________________________________________________________
conv2d_166 (Conv2D) (None, 8, 8, 224) 129024 activation_165[0][0]
__________________________________________________________________________________________________
batch_normalization_166 (BatchN (None, 8, 8, 224) 672 conv2d_166[0][0]
__________________________________________________________________________________________________
activation_166 (Activation) (None, 8, 8, 224) 0 batch_normalization_166[0][0]
__________________________________________________________________________________________________
conv2d_164 (Conv2D) (None, 8, 8, 192) 399360 mixed_7a[0][0]
__________________________________________________________________________________________________
conv2d_167 (Conv2D) (None, 8, 8, 256) 172032 activation_166[0][0]
__________________________________________________________________________________________________
batch_normalization_164 (BatchN (None, 8, 8, 192) 576 conv2d_164[0][0]
__________________________________________________________________________________________________
batch_normalization_167 (BatchN (None, 8, 8, 256) 768 conv2d_167[0][0]
__________________________________________________________________________________________________
activation_164 (Activation) (None, 8, 8, 192) 0 batch_normalization_164[0][0]
__________________________________________________________________________________________________
activation_167 (Activation) (None, 8, 8, 256) 0 batch_normalization_167[0][0]
__________________________________________________________________________________________________
block8_1_mixed (Concatenate) (None, 8, 8, 448) 0 activation_164[0][0]
activation_167[0][0]
__________________________________________________________________________________________________
block8_1_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_1_mixed[0][0]
__________________________________________________________________________________________________
block8_1 (Lambda) (None, 8, 8, 2080) 0 mixed_7a[0][0]
block8_1_conv[0][0]
__________________________________________________________________________________________________
block8_1_ac (Activation) (None, 8, 8, 2080) 0 block8_1[0][0]
__________________________________________________________________________________________________
conv2d_169 (Conv2D) (None, 8, 8, 192) 399360 block8_1_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_169 (BatchN (None, 8, 8, 192) 576 conv2d_169[0][0]
__________________________________________________________________________________________________
activation_169 (Activation) (None, 8, 8, 192) 0 batch_normalization_169[0][0]
__________________________________________________________________________________________________
conv2d_170 (Conv2D) (None, 8, 8, 224) 129024 activation_169[0][0]
__________________________________________________________________________________________________
batch_normalization_170 (BatchN (None, 8, 8, 224) 672 conv2d_170[0][0]
__________________________________________________________________________________________________
activation_170 (Activation) (None, 8, 8, 224) 0 batch_normalization_170[0][0]
__________________________________________________________________________________________________
conv2d_168 (Conv2D) (None, 8, 8, 192) 399360 block8_1_ac[0][0]
__________________________________________________________________________________________________
conv2d_171 (Conv2D) (None, 8, 8, 256) 172032 activation_170[0][0]
__________________________________________________________________________________________________
batch_normalization_168 (BatchN (None, 8, 8, 192) 576 conv2d_168[0][0]
__________________________________________________________________________________________________
batch_normalization_171 (BatchN (None, 8, 8, 256) 768 conv2d_171[0][0]
__________________________________________________________________________________________________
activation_168 (Activation) (None, 8, 8, 192) 0 batch_normalization_168[0][0]
__________________________________________________________________________________________________
activation_171 (Activation) (None, 8, 8, 256) 0 batch_normalization_171[0][0]
__________________________________________________________________________________________________
block8_2_mixed (Concatenate) (None, 8, 8, 448) 0 activation_168[0][0]
activation_171[0][0]
__________________________________________________________________________________________________
block8_2_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_2_mixed[0][0]
__________________________________________________________________________________________________
block8_2 (Lambda) (None, 8, 8, 2080) 0 block8_1_ac[0][0]
block8_2_conv[0][0]
__________________________________________________________________________________________________
block8_2_ac (Activation) (None, 8, 8, 2080) 0 block8_2[0][0]
__________________________________________________________________________________________________
conv2d_173 (Conv2D) (None, 8, 8, 192) 399360 block8_2_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_173 (BatchN (None, 8, 8, 192) 576 conv2d_173[0][0]
__________________________________________________________________________________________________
activation_173 (Activation) (None, 8, 8, 192) 0 batch_normalization_173[0][0]
__________________________________________________________________________________________________
conv2d_174 (Conv2D) (None, 8, 8, 224) 129024 activation_173[0][0]
__________________________________________________________________________________________________
batch_normalization_174 (BatchN (None, 8, 8, 224) 672 conv2d_174[0][0]
__________________________________________________________________________________________________
activation_174 (Activation) (None, 8, 8, 224) 0 batch_normalization_174[0][0]
__________________________________________________________________________________________________
conv2d_172 (Conv2D) (None, 8, 8, 192) 399360 block8_2_ac[0][0]
__________________________________________________________________________________________________
conv2d_175 (Conv2D) (None, 8, 8, 256) 172032 activation_174[0][0]
__________________________________________________________________________________________________
batch_normalization_172 (BatchN (None, 8, 8, 192) 576 conv2d_172[0][0]
__________________________________________________________________________________________________
batch_normalization_175 (BatchN (None, 8, 8, 256) 768 conv2d_175[0][0]
__________________________________________________________________________________________________
activation_172 (Activation) (None, 8, 8, 192) 0 batch_normalization_172[0][0]
__________________________________________________________________________________________________
activation_175 (Activation) (None, 8, 8, 256) 0 batch_normalization_175[0][0]
__________________________________________________________________________________________________
block8_3_mixed (Concatenate) (None, 8, 8, 448) 0 activation_172[0][0]
activation_175[0][0]
__________________________________________________________________________________________________
block8_3_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_3_mixed[0][0]
__________________________________________________________________________________________________
block8_3 (Lambda) (None, 8, 8, 2080) 0 block8_2_ac[0][0]
block8_3_conv[0][0]
__________________________________________________________________________________________________
block8_3_ac (Activation) (None, 8, 8, 2080) 0 block8_3[0][0]
__________________________________________________________________________________________________
conv2d_177 (Conv2D) (None, 8, 8, 192) 399360 block8_3_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_177 (BatchN (None, 8, 8, 192) 576 conv2d_177[0][0]
__________________________________________________________________________________________________
activation_177 (Activation) (None, 8, 8, 192) 0 batch_normalization_177[0][0]
__________________________________________________________________________________________________
conv2d_178 (Conv2D) (None, 8, 8, 224) 129024 activation_177[0][0]
__________________________________________________________________________________________________
batch_normalization_178 (BatchN (None, 8, 8, 224) 672 conv2d_178[0][0]
__________________________________________________________________________________________________
activation_178 (Activation) (None, 8, 8, 224) 0 batch_normalization_178[0][0]
__________________________________________________________________________________________________
conv2d_176 (Conv2D) (None, 8, 8, 192) 399360 block8_3_ac[0][0]
__________________________________________________________________________________________________
conv2d_179 (Conv2D) (None, 8, 8, 256) 172032 activation_178[0][0]
__________________________________________________________________________________________________
batch_normalization_176 (BatchN (None, 8, 8, 192) 576 conv2d_176[0][0]
__________________________________________________________________________________________________
batch_normalization_179 (BatchN (None, 8, 8, 256) 768 conv2d_179[0][0]
__________________________________________________________________________________________________
activation_176 (Activation) (None, 8, 8, 192) 0 batch_normalization_176[0][0]
__________________________________________________________________________________________________
activation_179 (Activation) (None, 8, 8, 256) 0 batch_normalization_179[0][0]
__________________________________________________________________________________________________
block8_4_mixed (Concatenate) (None, 8, 8, 448) 0 activation_176[0][0]
activation_179[0][0]
__________________________________________________________________________________________________
block8_4_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_4_mixed[0][0]
__________________________________________________________________________________________________
block8_4 (Lambda) (None, 8, 8, 2080) 0 block8_3_ac[0][0]
block8_4_conv[0][0]
__________________________________________________________________________________________________
block8_4_ac (Activation) (None, 8, 8, 2080) 0 block8_4[0][0]
__________________________________________________________________________________________________
conv2d_181 (Conv2D) (None, 8, 8, 192) 399360 block8_4_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_181 (BatchN (None, 8, 8, 192) 576 conv2d_181[0][0]
__________________________________________________________________________________________________
activation_181 (Activation) (None, 8, 8, 192) 0 batch_normalization_181[0][0]
__________________________________________________________________________________________________
conv2d_182 (Conv2D) (None, 8, 8, 224) 129024 activation_181[0][0]
__________________________________________________________________________________________________
batch_normalization_182 (BatchN (None, 8, 8, 224) 672 conv2d_182[0][0]
__________________________________________________________________________________________________
activation_182 (Activation) (None, 8, 8, 224) 0 batch_normalization_182[0][0]
__________________________________________________________________________________________________
conv2d_180 (Conv2D) (None, 8, 8, 192) 399360 block8_4_ac[0][0]
__________________________________________________________________________________________________
conv2d_183 (Conv2D) (None, 8, 8, 256) 172032 activation_182[0][0]
__________________________________________________________________________________________________
batch_normalization_180 (BatchN (None, 8, 8, 192) 576 conv2d_180[0][0]
__________________________________________________________________________________________________
batch_normalization_183 (BatchN (None, 8, 8, 256) 768 conv2d_183[0][0]
__________________________________________________________________________________________________
activation_180 (Activation) (None, 8, 8, 192) 0 batch_normalization_180[0][0]
__________________________________________________________________________________________________
activation_183 (Activation) (None, 8, 8, 256) 0 batch_normalization_183[0][0]
__________________________________________________________________________________________________
block8_5_mixed (Concatenate) (None, 8, 8, 448) 0 activation_180[0][0]
activation_183[0][0]
__________________________________________________________________________________________________
block8_5_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_5_mixed[0][0]
__________________________________________________________________________________________________
block8_5 (Lambda) (None, 8, 8, 2080) 0 block8_4_ac[0][0]
block8_5_conv[0][0]
__________________________________________________________________________________________________
block8_5_ac (Activation) (None, 8, 8, 2080) 0 block8_5[0][0]
__________________________________________________________________________________________________
conv2d_185 (Conv2D) (None, 8, 8, 192) 399360 block8_5_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_185 (BatchN (None, 8, 8, 192) 576 conv2d_185[0][0]
__________________________________________________________________________________________________
activation_185 (Activation) (None, 8, 8, 192) 0 batch_normalization_185[0][0]
__________________________________________________________________________________________________
conv2d_186 (Conv2D) (None, 8, 8, 224) 129024 activation_185[0][0]
__________________________________________________________________________________________________
batch_normalization_186 (BatchN (None, 8, 8, 224) 672 conv2d_186[0][0]
__________________________________________________________________________________________________
activation_186 (Activation) (None, 8, 8, 224) 0 batch_normalization_186[0][0]
__________________________________________________________________________________________________
conv2d_184 (Conv2D) (None, 8, 8, 192) 399360 block8_5_ac[0][0]
__________________________________________________________________________________________________
conv2d_187 (Conv2D) (None, 8, 8, 256) 172032 activation_186[0][0]
__________________________________________________________________________________________________
batch_normalization_184 (BatchN (None, 8, 8, 192) 576 conv2d_184[0][0]
__________________________________________________________________________________________________
batch_normalization_187 (BatchN (None, 8, 8, 256) 768 conv2d_187[0][0]
__________________________________________________________________________________________________
activation_184 (Activation) (None, 8, 8, 192) 0 batch_normalization_184[0][0]
__________________________________________________________________________________________________
activation_187 (Activation) (None, 8, 8, 256) 0 batch_normalization_187[0][0]
__________________________________________________________________________________________________
block8_6_mixed (Concatenate) (None, 8, 8, 448) 0 activation_184[0][0]
activation_187[0][0]
__________________________________________________________________________________________________
block8_6_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_6_mixed[0][0]
__________________________________________________________________________________________________
block8_6 (Lambda) (None, 8, 8, 2080) 0 block8_5_ac[0][0]
block8_6_conv[0][0]
__________________________________________________________________________________________________
block8_6_ac (Activation) (None, 8, 8, 2080) 0 block8_6[0][0]
__________________________________________________________________________________________________
conv2d_189 (Conv2D) (None, 8, 8, 192) 399360 block8_6_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_189 (BatchN (None, 8, 8, 192) 576 conv2d_189[0][0]
__________________________________________________________________________________________________
activation_189 (Activation) (None, 8, 8, 192) 0 batch_normalization_189[0][0]
__________________________________________________________________________________________________
conv2d_190 (Conv2D) (None, 8, 8, 224) 129024 activation_189[0][0]
__________________________________________________________________________________________________
batch_normalization_190 (BatchN (None, 8, 8, 224) 672 conv2d_190[0][0]
__________________________________________________________________________________________________
activation_190 (Activation) (None, 8, 8, 224) 0 batch_normalization_190[0][0]
__________________________________________________________________________________________________
conv2d_188 (Conv2D) (None, 8, 8, 192) 399360 block8_6_ac[0][0]
__________________________________________________________________________________________________
conv2d_191 (Conv2D) (None, 8, 8, 256) 172032 activation_190[0][0]
__________________________________________________________________________________________________
batch_normalization_188 (BatchN (None, 8, 8, 192) 576 conv2d_188[0][0]
__________________________________________________________________________________________________
batch_normalization_191 (BatchN (None, 8, 8, 256) 768 conv2d_191[0][0]
__________________________________________________________________________________________________
activation_188 (Activation) (None, 8, 8, 192) 0 batch_normalization_188[0][0]
__________________________________________________________________________________________________
activation_191 (Activation) (None, 8, 8, 256) 0 batch_normalization_191[0][0]
__________________________________________________________________________________________________
block8_7_mixed (Concatenate) (None, 8, 8, 448) 0 activation_188[0][0]
activation_191[0][0]
__________________________________________________________________________________________________
block8_7_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_7_mixed[0][0]
__________________________________________________________________________________________________
block8_7 (Lambda) (None, 8, 8, 2080) 0 block8_6_ac[0][0]
block8_7_conv[0][0]
__________________________________________________________________________________________________
block8_7_ac (Activation) (None, 8, 8, 2080) 0 block8_7[0][0]
__________________________________________________________________________________________________
conv2d_193 (Conv2D) (None, 8, 8, 192) 399360 block8_7_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_193 (BatchN (None, 8, 8, 192) 576 conv2d_193[0][0]
__________________________________________________________________________________________________
activation_193 (Activation) (None, 8, 8, 192) 0 batch_normalization_193[0][0]
__________________________________________________________________________________________________
conv2d_194 (Conv2D) (None, 8, 8, 224) 129024 activation_193[0][0]
__________________________________________________________________________________________________
batch_normalization_194 (BatchN (None, 8, 8, 224) 672 conv2d_194[0][0]
__________________________________________________________________________________________________
activation_194 (Activation) (None, 8, 8, 224) 0 batch_normalization_194[0][0]
__________________________________________________________________________________________________
conv2d_192 (Conv2D) (None, 8, 8, 192) 399360 block8_7_ac[0][0]
__________________________________________________________________________________________________
conv2d_195 (Conv2D) (None, 8, 8, 256) 172032 activation_194[0][0]
__________________________________________________________________________________________________
batch_normalization_192 (BatchN (None, 8, 8, 192) 576 conv2d_192[0][0]
__________________________________________________________________________________________________
batch_normalization_195 (BatchN (None, 8, 8, 256) 768 conv2d_195[0][0]
__________________________________________________________________________________________________
activation_192 (Activation) (None, 8, 8, 192) 0 batch_normalization_192[0][0]
__________________________________________________________________________________________________
activation_195 (Activation) (None, 8, 8, 256) 0 batch_normalization_195[0][0]
__________________________________________________________________________________________________
block8_8_mixed (Concatenate) (None, 8, 8, 448) 0 activation_192[0][0]
activation_195[0][0]
__________________________________________________________________________________________________
block8_8_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_8_mixed[0][0]
__________________________________________________________________________________________________
block8_8 (Lambda) (None, 8, 8, 2080) 0 block8_7_ac[0][0]
block8_8_conv[0][0]
__________________________________________________________________________________________________
block8_8_ac (Activation) (None, 8, 8, 2080) 0 block8_8[0][0]
__________________________________________________________________________________________________
conv2d_197 (Conv2D) (None, 8, 8, 192) 399360 block8_8_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_197 (BatchN (None, 8, 8, 192) 576 conv2d_197[0][0]
__________________________________________________________________________________________________
activation_197 (Activation) (None, 8, 8, 192) 0 batch_normalization_197[0][0]
__________________________________________________________________________________________________
conv2d_198 (Conv2D) (None, 8, 8, 224) 129024 activation_197[0][0]
__________________________________________________________________________________________________
batch_normalization_198 (BatchN (None, 8, 8, 224) 672 conv2d_198[0][0]
__________________________________________________________________________________________________
activation_198 (Activation) (None, 8, 8, 224) 0 batch_normalization_198[0][0]
__________________________________________________________________________________________________
conv2d_196 (Conv2D) (None, 8, 8, 192) 399360 block8_8_ac[0][0]
__________________________________________________________________________________________________
conv2d_199 (Conv2D) (None, 8, 8, 256) 172032 activation_198[0][0]
__________________________________________________________________________________________________
batch_normalization_196 (BatchN (None, 8, 8, 192) 576 conv2d_196[0][0]
__________________________________________________________________________________________________
batch_normalization_199 (BatchN (None, 8, 8, 256) 768 conv2d_199[0][0]
__________________________________________________________________________________________________
activation_196 (Activation) (None, 8, 8, 192) 0 batch_normalization_196[0][0]
__________________________________________________________________________________________________
activation_199 (Activation) (None, 8, 8, 256) 0 batch_normalization_199[0][0]
__________________________________________________________________________________________________
block8_9_mixed (Concatenate) (None, 8, 8, 448) 0 activation_196[0][0]
activation_199[0][0]
__________________________________________________________________________________________________
block8_9_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_9_mixed[0][0]
__________________________________________________________________________________________________
block8_9 (Lambda) (None, 8, 8, 2080) 0 block8_8_ac[0][0]
block8_9_conv[0][0]
__________________________________________________________________________________________________
block8_9_ac (Activation) (None, 8, 8, 2080) 0 block8_9[0][0]
__________________________________________________________________________________________________
conv2d_201 (Conv2D) (None, 8, 8, 192) 399360 block8_9_ac[0][0]
__________________________________________________________________________________________________
batch_normalization_201 (BatchN (None, 8, 8, 192) 576 conv2d_201[0][0]
__________________________________________________________________________________________________
activation_201 (Activation) (None, 8, 8, 192) 0 batch_normalization_201[0][0]
__________________________________________________________________________________________________
conv2d_202 (Conv2D) (None, 8, 8, 224) 129024 activation_201[0][0]
__________________________________________________________________________________________________
batch_normalization_202 (BatchN (None, 8, 8, 224) 672 conv2d_202[0][0]
__________________________________________________________________________________________________
activation_202 (Activation) (None, 8, 8, 224) 0 batch_normalization_202[0][0]
__________________________________________________________________________________________________
conv2d_200 (Conv2D) (None, 8, 8, 192) 399360 block8_9_ac[0][0]
__________________________________________________________________________________________________
conv2d_203 (Conv2D) (None, 8, 8, 256) 172032 activation_202[0][0]
__________________________________________________________________________________________________
batch_normalization_200 (BatchN (None, 8, 8, 192) 576 conv2d_200[0][0]
__________________________________________________________________________________________________
batch_normalization_203 (BatchN (None, 8, 8, 256) 768 conv2d_203[0][0]
__________________________________________________________________________________________________
activation_200 (Activation) (None, 8, 8, 192) 0 batch_normalization_200[0][0]
__________________________________________________________________________________________________
activation_203 (Activation) (None, 8, 8, 256) 0 batch_normalization_203[0][0]
__________________________________________________________________________________________________
block8_10_mixed (Concatenate) (None, 8, 8, 448) 0 activation_200[0][0]
activation_203[0][0]
__________________________________________________________________________________________________
block8_10_conv (Conv2D) (None, 8, 8, 2080) 933920 block8_10_mixed[0][0]
__________________________________________________________________________________________________
block8_10 (Lambda) (None, 8, 8, 2080) 0 block8_9_ac[0][0]
block8_10_conv[0][0]
__________________________________________________________________________________________________
conv_7b (Conv2D) (None, 8, 8, 1536) 3194880 block8_10[0][0]
__________________________________________________________________________________________________
conv_7b_bn (BatchNormalization) (None, 8, 8, 1536) 4608 conv_7b[0][0]
__________________________________________________________________________________________________
conv_7b_ac (Activation) (None, 8, 8, 1536) 0 conv_7b_bn[0][0]
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2 (None, 1536) 0 conv_7b_ac[0][0]
__________________________________________________________________________________________________
predictions (Dense) (None, 1000) 1537000 avg_pool[0][0]
==================================================================================================
Total params: 55,873,736
Trainable params: 55,813,192
Non-trainable params: 60,544
__________________________________________________________________________________________________
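The summary ends with an 8 x 8 x 1536 feature map (conv_7b_ac) feeding a global average pooling layer and the 1000-way predictions layer; that 8 x 8 spatial grid is why HEATMAP_SHAPE=[8,8] is passed to the class activation map helper further down. As a quick sanity check (a minimal sketch, assuming model_InceptionResNetV2 is the model summarized above), these layers can be looked up by name:

# Minimal sketch: inspect the layers relevant for the 8x8 class activation map.
# Assumption: model_InceptionResNetV2 is the InceptionResNetV2 model summarized above.
last_conv_layer = model_InceptionResNetV2.get_layer('conv_7b_ac')
print(last_conv_layer.output_shape)                                   # (None, 8, 8, 1536)
print(model_InceptionResNetV2.get_layer('avg_pool').output_shape)     # (None, 1536)
print(model_InceptionResNetV2.get_layer('predictions').output_shape)  # (None, 1000)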
In [45]:
! apt-get install -y graphviz libgraphviz-dev && pip3 install pydot graphviz
Reading package lists... Done
Building dependency tree
Reading state information... Done
graphviz is already the newest version (2.40.1-2).
libgraphviz-dev is already the newest version (2.40.1-2).
0 upgraded, 0 newly installed, 0 to remove and 8 not upgraded.
Requirement already satisfied: pydot in /usr/local/lib/python3.6/dist-packages (1.3.0)
Requirement already satisfied: graphviz in /usr/local/lib/python3.6/dist-packages (0.10.1)
Requirement already satisfied: pyparsing>=2.1.4 in /usr/local/lib/python3.6/dist-packages (from pydot) (2.3.1)
In [0]:
from keras.utils import plot_model
import pydot
import graphviz # apt-get install -y graphviz libgraphviz-dev && pip3 install pydot graphviz
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
In [47]:
output_dir = './'
plot_model(model_InceptionResNetV2, to_file=output_dir + 'model_summary_plot.png')
SVG(model_to_dot(model_InceptionResNetV2).create(prog='dot', format='svg'))
Out[47]:
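plot_model can also annotate every node with its input and output shapes, which makes this very deep graph easier to read. A hedged variant of the call above (same model and output directory; show_shapes is an optional flag of keras.utils.plot_model):

# Optional variant: render the graph with layer shapes included.
plot_model(model_InceptionResNetV2,
           to_file=output_dir + 'model_summary_plot_with_shapes.png',
           show_shapes=True)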
In [0]:
img = image.load_img(img_path, target_size=(299, 299))
In [0]:
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)
In [0]:
from keras.applications.inception_resnet_v2 import preprocess_input as PRE_PROCESSOR
import pandas as pd
from keras.applications.inception_resnet_v2 import decode_predictions as LABEL_DECODER
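These aliases bind the InceptionResNetV2-specific pixel preprocessing and ImageNet label decoding. A minimal sketch of calling them directly, assuming x is the 1 x 299 x 299 x 3 array prepared in the cells above:

# Minimal sketch: use the aliases directly, outside the CAM helper.
x_pre = PRE_PROCESSOR(x.copy())                        # scales pixels to the [-1, 1] range
preds = model_InceptionResNetV2.predict(x_pre)
print(LABEL_DECODER(preds, top=3)[0])                  # [(wordnet_id, label, probability), ...]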
In [51]:
output = class_activation_map(INPUT_IMG_FILE,
PRE_PROCESSOR=PRE_PROCESSOR,
LABEL_DECODER=LABEL_DECODER,
MODEL=model_InceptionResNetV2,
LABELS=None,
IM_WIDTH=299,
IM_HEIGHT=299,
EVAL_STEPS=1,
URL_MODE=False,
FILE_MODE=False,
HEATMAP_SHAPE=[8,8])
PREDICTION: hummingbird
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
Completed processing 1 out of 1 steps in 9.6144540309906 seconds ...
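class_activation_map is the helper defined earlier in this notebook. The sketch below is not that implementation; it only illustrates the standard Grad-CAM recipe such a helper typically follows: take the gradient of the winning class score with respect to the last 8 x 8 feature map, average it into one weight per channel, and combine. It assumes the Keras backend is imported as K and reuses x_pre from the sketch above:

import numpy as np
from keras import backend as K

# Illustrative Grad-CAM sketch (not the notebook's class_activation_map itself).
preds = model_InceptionResNetV2.predict(x_pre)                        # x_pre: preprocessed 1x299x299x3 batch
class_idx = np.argmax(preds[0])                                       # index of the winning ImageNet class
class_output = model_InceptionResNetV2.output[:, class_idx]
last_conv = model_InceptionResNetV2.get_layer('conv_7b_ac').output    # (None, 8, 8, 1536)

grads = K.gradients(class_output, last_conv)[0]                       # d(class score) / d(feature map)
pooled_grads = K.mean(grads, axis=(0, 1, 2))                          # one weight per channel
iterate = K.function([model_InceptionResNetV2.input], [pooled_grads, last_conv[0]])
pooled_grads_value, conv_value = iterate([x_pre])

heatmap = np.mean(conv_value * pooled_grads_value, axis=-1)           # weighted 8x8 activation map
heatmap = np.maximum(heatmap, 0)                                      # keep only positive evidence
heatmap /= (np.max(heatmap) + K.epsilon())                            # normalise to [0, 1]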
In [52]:
HEATMAP = output[0]
LABEL = output[3]
plt.matshow(HEATMAP)
plt.show()
print (LABEL)
category probability
0 hummingbird 0.973336
1 water_ouzel 0.000573
2 jacamar 0.000501
In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
HEATMAP,
THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
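heatmap_overlay is also a helper defined earlier in this notebook; the sketch below is not that implementation, only the usual overlay step it wraps: upsample the 8 x 8 heatmap to the image size, colour-map it, and blend it over the original. It assumes OpenCV is available as cv2 and that HEATMAP is the normalised array shown with matshow above:

import cv2
import numpy as np

# Illustrative overlay sketch (not the notebook's heatmap_overlay itself).
img_bgr = cv2.imread(INPUT_IMG_FILE)                                  # original image in BGR order
hm = cv2.resize(np.float32(HEATMAP), (img_bgr.shape[1], img_bgr.shape[0]))  # upsample 8x8 -> image size
hm = np.uint8(255 * hm)                                               # rescale to 0-255
hm = cv2.applyColorMap(hm, cv2.COLORMAP_JET)                          # pseudo-colour the heatmap
overlay = cv2.addWeighted(img_bgr, 1.0, hm, 0.4, 0)                   # blend heatmap over the image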
In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)
img=mpimg.imread(output_file)
In [55]:
plt.imshow(img)
Out[55]:
<matplotlib.image.AxesImage at 0x7fe25e64c978>
In [59]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banana_01.jpg -O banana_01.jpg
--2019-02-21 09:30:51-- https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banana_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 80697 (79K) [image/jpeg]
Saving to: ‘banana_01.jpg’
banana_01.jpg 100%[===================>] 78.81K --.-KB/s in 0.01s
2019-02-21 09:30:51 (7.33 MB/s) - ‘banana_01.jpg’ saved [80697/80697]
In [0]:
INPUT_IMG_FILE = './banana_01.jpg'
In [61]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)
Out[61]:
<matplotlib.image.AxesImage at 0x7fe25e126320>
In [62]:
output = class_activation_map(INPUT_IMG_FILE,
PRE_PROCESSOR=PRE_PROCESSOR,
LABEL_DECODER=LABEL_DECODER,
MODEL=model_InceptionResNetV2,
LABELS=None,
IM_WIDTH=299,
IM_HEIGHT=299,
EVAL_STEPS=1,
URL_MODE=False,
FILE_MODE=False,
HEATMAP_SHAPE=[8,8])
PREDICTION: banana
Completed processing 1 out of 1 steps in 4.7812793254852295 seconds ...
In [63]:
HEATMAP = output[0]
LABEL = output[3]
plt.matshow(HEATMAP)
plt.show()
print (LABEL)
category probability
0 banana 0.868550
1 orange 0.006009
2 grocery_store 0.001587
In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
HEATMAP,
THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)
img=mpimg.imread(output_file)
In [66]:
plt.imshow(img)
Out[66]:
<matplotlib.image.AxesImage at 0x7fe25df8cb38>
In [67]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=LABEL,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')
/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
stat_data = remove_na(group_data)
Out[67]:
Text(0.5, 1.0, 'Top 3 Predictions:')
In [68]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_01.jpg -O banjo_player_01.jpg
--2019-02-21 09:31:30-- https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 7797 (7.6K) [image/jpeg]
Saving to: ‘banjo_player_01.jpg’
banjo_player_01.jpg 100%[===================>] 7.61K --.-KB/s in 0s
2019-02-21 09:31:30 (79.7 MB/s) - ‘banjo_player_01.jpg’ saved [7797/7797]
In [0]:
INPUT_IMG_FILE = './banjo_player_01.jpg'
In [70]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)
Out[70]:
<matplotlib.image.AxesImage at 0x7fe25def0b00>
In [71]:
output = class_activation_map(INPUT_IMG_FILE,
PRE_PROCESSOR=PRE_PROCESSOR,
LABEL_DECODER=LABEL_DECODER,
MODEL=model_InceptionResNetV2,
LABELS=None,
IM_WIDTH=299,
IM_HEIGHT=299,
EVAL_STEPS=1,
URL_MODE=False,
FILE_MODE=False,
HEATMAP_SHAPE=[8,8])
PREDICTION: banjo
Completed processing 1 out of 1 steps in 4.5725181102752686 seconds ...
In [72]:
HEATMAP = output[0]
LABEL = output[3]
plt.matshow(HEATMAP)
plt.show()
print (LABEL)
category probability
0 banjo 0.933209
1 toilet_seat 0.000398
2 drum 0.000398
In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
HEATMAP,
THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)
img=mpimg.imread(output_file)
In [75]:
plt.imshow(img)
Out[75]:
<matplotlib.image.AxesImage at 0x7fe25dd77860>
In [76]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=LABEL,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')
/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
stat_data = remove_na(group_data)
Out[76]:
Text(0.5, 1.0, 'Top 3 Predictions:')
In [77]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_02.jpg -O banjo_player_02.jpg
--2019-02-21 09:32:04-- https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_02.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1073613 (1.0M) [image/jpeg]
Saving to: ‘banjo_player_02.jpg’
banjo_player_02.jpg 100%[===================>] 1.02M --.-KB/s in 0.03s
2019-02-21 09:32:04 (29.8 MB/s) - ‘banjo_player_02.jpg’ saved [1073613/1073613]
In [0]:
INPUT_IMG_FILE = './banjo_player_02.jpg'
In [79]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)
Out[79]:
<matplotlib.image.AxesImage at 0x7fe25dca3eb8>
In [80]:
output = class_activation_map(INPUT_IMG_FILE,
PRE_PROCESSOR=PRE_PROCESSOR,
LABEL_DECODER=LABEL_DECODER,
MODEL=model_InceptionResNetV2,
LABELS=None,
IM_WIDTH=299,
IM_HEIGHT=299,
EVAL_STEPS=1,
URL_MODE=False,
FILE_MODE=False,
HEATMAP_SHAPE=[8,8])
PREDICTION: banjo
Completed processing 1 out of 1 steps in 4.621196985244751 seconds ...
In [81]:
HEATMAP = output[0]
LABEL = output[3]
plt.matshow(HEATMAP)
plt.show()
print (LABEL)
category probability
0 banjo 0.956864
1 toilet_seat 0.000349
2 strainer 0.000300
In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
HEATMAP,
THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)
img=mpimg.imread(output_file)
In [84]:
plt.imshow(img)
Out[84]:
<matplotlib.image.AxesImage at 0x7fe25dae4cf8>
In [85]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=LABEL,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')
/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
stat_data = remove_na(group_data)
Out[85]:
Text(0.5, 1.0, 'Top 3 Predictions:')
In [86]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/throne_01.jpg -O throne_01.jpg
--2019-02-21 09:32:38-- https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/throne_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 768540 (751K) [image/jpeg]
Saving to: ‘throne_01.jpg’
throne_01.jpg 100%[===================>] 750.53K --.-KB/s in 0.03s
2019-02-21 09:32:38 (26.0 MB/s) - ‘throne_01.jpg’ saved [768540/768540]
In [0]:
INPUT_IMG_FILE = './throne_01.jpg'
In [88]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)
Out[88]:
<matplotlib.image.AxesImage at 0x7fe25da19390>
In [89]:
output = class_activation_map(INPUT_IMG_FILE,
PRE_PROCESSOR=PRE_PROCESSOR,
LABEL_DECODER=LABEL_DECODER,
MODEL=model_InceptionResNetV2,
LABELS=None,
IM_WIDTH=299,
IM_HEIGHT=299,
EVAL_STEPS=1,
URL_MODE=False,
FILE_MODE=False,
HEATMAP_SHAPE=[8,8])
PREDICTION: throne
Completed processing 1 out of 1 steps in 4.443957090377808 seconds ...
In [90]:
HEATMAP = output[0]
LABEL = output[3]
plt.matshow(HEATMAP)
plt.show()
print (LABEL)
category probability
0 throne 0.901864
1 four-poster 0.014415
2 altar 0.002789
In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
HEATMAP,
THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)
img=mpimg.imread(output_file)
In [93]:
plt.imshow(img)
Out[93]:
<matplotlib.image.AxesImage at 0x7fe25d8a0e48>