Class activation maps (CAM) are one of many ways to visualize and gain insight into a convolutional neural network (CNN). In this visualization approach, a "class activation" heatmap is overlaid on the input image. A class activation heatmap is a 2D grid of scores associated with a specific output class, computed for every location in the input image, indicating how important each location is with respect to that class. It is an easy way to show an observer which features the CNN model is looking at while generating its predictions.
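Concretely, the variant implemented below corresponds to the gradient-weighted formulation (Grad-CAM): each feature map of a chosen convolutional layer is weighted by the mean gradient of the class score with respect to that map, and the weighted sum is passed through a ReLU. As a hedged summary of the computation carried out step by step in Part 1 (writing $y^c$ for the class score, $A^k$ for the $k$-th feature map of the chosen layer, and $Z$ for its spatial size; these symbols do not appear in the code itself):

$$\alpha_k^c = \frac{1}{Z}\sum_i\sum_j \frac{\partial y^c}{\partial A^k_{ij}}, \qquad L^c_{\mathrm{Grad\text{-}CAM}} = \mathrm{ReLU}\Big(\sum_k \alpha_k^c A^k\Big)$$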

This notebook uses Keras and TensorFlow.

The visual summary of features that CAM provides is very useful for building more explainable deep-learning models, and many interesting insights can be derived from a CNN model this way. Some of these examples are shown towards the end of this notebook.

Part 1 -- Visualizing the VGG16 model using CAM:


In [1]:
from keras.applications.vgg16 import VGG16
import matplotlib.image as mpimg
from keras import backend as K
import matplotlib.pyplot as plt
%matplotlib inline
K.clear_session()


Using TensorFlow backend.

In [2]:
model_vgg16 = VGG16(weights='imagenet')


WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.

In [3]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/hummingbird_01.jpg -O hummingbird_01.jpg


--2019-02-21 09:28:41--  https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/hummingbird_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 89448 (87K) [image/jpeg]
Saving to: ‘hummingbird_01.jpg’

hummingbird_01.jpg  100%[===================>]  87.35K  --.-KB/s    in 0.01s   

2019-02-21 09:28:41 (7.92 MB/s) - ‘hummingbird_01.jpg’ saved [89448/89448]

Sample Image


In [4]:
img_path = './hummingbird_01.jpg'
img=mpimg.imread(img_path)
plt.imshow(img)


Out[4]:
<matplotlib.image.AxesImage at 0x7fe29017cf60>

Resize the image to fit the input size of VGG16 (224 x 224)


In [0]:
from keras.preprocessing import image
img = image.load_img(img_path, target_size=(224, 224))

Convert to a NumPy array


In [0]:
x = image.img_to_array(img)

Reshape the data into "batch" form (adding a leading batch dimension), because the model only accepts input in this form


In [0]:
import numpy as np
x = np.expand_dims(x, axis=0)

"Batch" form


In [8]:
x.shape


Out[8]:
(1, 224, 224, 3)

Preprocessing -- the VGG16 preprocess_input function converts the image from RGB to BGR and zero-centers each color channel with respect to the ImageNet dataset


In [0]:
from keras.applications.vgg16 import preprocess_input
x = preprocess_input(x)

Prediction


In [10]:
import pandas as pd
from keras.applications.vgg16 import decode_predictions
preds = model_vgg16.predict(x)
predictions = pd.DataFrame(decode_predictions(preds, top=3)[0],columns=['col1','category','probability']).iloc[:,1:]
print('PREDICTION:',predictions.loc[0,'category'])


PREDICTION: hummingbird

In [11]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=predictions,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')


/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
  stat_data = remove_na(group_data)
Out[11]:
Text(0.5, 1.0, 'Top 3 Predictions:')

Index of the prediction


In [0]:
argmax = np.argmax(preds[0])

Select the model output entry corresponding to the predicted class


In [0]:
output = model_vgg16.output[:, argmax]

Model Architecture


In [14]:
model_vgg16.summary()


_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         (None, 224, 224, 3)       0         
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792      
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 224, 224, 64)      36928     
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 112, 112, 64)      0         
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 112, 112, 128)     73856     
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 112, 112, 128)     147584    
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 56, 56, 128)       0         
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 56, 56, 256)       295168    
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 56, 56, 256)       590080    
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 28, 28, 256)       0         
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 28, 28, 512)       1180160   
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 28, 28, 512)       2359808   
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 14, 14, 512)       0         
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0         
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
fc1 (Dense)                  (None, 4096)              102764544 
_________________________________________________________________
fc2 (Dense)                  (None, 4096)              16781312  
_________________________________________________________________
predictions (Dense)          (None, 1000)              4097000   
=================================================================
Total params: 138,357,544
Trainable params: 138,357,544
Non-trainable params: 0
_________________________________________________________________

We want the final convolutional layer of VGG16, block5_conv3


In [0]:
last_conv_layer = model_vgg16.get_layer('block5_conv3')

Get the gradient of the class output with respect to the feature maps of this layer


In [0]:
grads = K.gradients(output, last_conv_layer.output)[0]

Pool the gradients: each entry of the resulting tensor is the mean intensity of the gradient over a specific feature-map channel. It has a shape of (512,)


In [0]:
pooled_grads = K.mean(grads, axis=(0, 1, 2))

Access the values of the quantities we just defined


In [0]:
iterate = K.function([model_vgg16.input], [pooled_grads, last_conv_layer.output[0]])

These are the values of these two quantities, as NumPy arrays, given our sample image of a hummingbird


In [0]:
pooled_grads_value, conv_layer_output_value = iterate([x])

We multiply each channel in the feature-map array by "how important this channel is" with regard to the predicted class


In [0]:
for i in range(conv_layer_output_value.shape[2]):
    conv_layer_output_value[:, :, i] *= pooled_grads_value[i]

Plotting the Heatmap


In [21]:
heatmap = np.mean(conv_layer_output_value, axis=-1)
heatmap = np.maximum(heatmap, 0)
heatmap /= np.max(heatmap)
plt.matshow(heatmap)
plt.show()
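
The steps above rely on the TensorFlow 1 style `K.gradients` / `K.function` API. As a hedged alternative for readers running TensorFlow 2 with `tf.keras` (where eager execution makes `K.gradients` unavailable), the same heatmap could be computed with `tf.GradientTape`, roughly as in the sketch below; it assumes `model_vgg16` is a `tf.keras` model and `x` is the preprocessed batch from above.

import numpy as np
import tensorflow as tf

# Sketch only: Grad-CAM heatmap with tf.GradientTape (TensorFlow 2 / tf.keras assumed)
grad_model = tf.keras.models.Model(
    inputs=model_vgg16.input,
    outputs=[model_vgg16.get_layer('block5_conv3').output, model_vgg16.output])

with tf.GradientTape() as tape:
    conv_output, preds = grad_model(x)
    top_class = tf.argmax(preds[0])                      # index of the predicted class
    class_channel = preds[:, top_class]                  # score of that class

grads = tape.gradient(class_channel, conv_output)        # gradient of the score w.r.t. the feature maps
pooled_grads = tf.reduce_mean(grads, axis=(0, 1, 2))     # one weight per channel, shape (512,)
heatmap = tf.reduce_sum(conv_output[0] * pooled_grads, axis=-1)
heatmap = np.maximum(heatmap, 0) / (np.max(heatmap) + 1e-8)  # ReLU and normalize to [0, 1]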


Load the image with OpenCV (cv2)


In [0]:
import cv2
img = cv2.imread(img_path)

Resize the heatmap


In [0]:
heatmap = cv2.resize(heatmap, (img.shape[1], img.shape[0]))

Scale the heatmap to 8-bit values


In [0]:
heatmap = np.uint8(255 * heatmap)

Apply a color map to the heatmap


In [0]:
heatmap = cv2.applyColorMap(heatmap, cv2.COLORMAP_JET)

Set the heatmap intensity factor and superimpose the heatmap on the original image


In [0]:
hif = .8

In [0]:
superimposed_img = heatmap * hif + img

Save to disk


In [0]:
output = './output.jpeg'
cv2.imwrite(output, superimposed_img)

img=mpimg.imread(output)

In [0]:
from google.colab import files
files.download(output)

Plot


In [30]:
plt.imshow(img)
plt.axis('off')
plt.title(predictions.loc[0,'category'])


Out[30]:
Text(0.5, 1.0, 'hummingbird')

Part 2 -- Implementing the CAM visualization as a function:


In [0]:
def class_activation_map(INPUT_IMG_FILE=None,
                         PRE_PROCESSOR=None,
                         LABEL_DECODER=None,
                         MODEL=None,
                         LABELS=None,
                         IM_WIDTH=299,
                         IM_HEIGHT=299,
                         CONV_LAYER='conv_7b',
                         URL_MODE=False,
                         FILE_MODE=True,
                         EVAL_STEPS=10,
                         HEATMAP_SHAPE=[14,14]):
  if INPUT_IMG_FILE is None:
    print ('No input file specified to generate predictions ...')
    return
  
  if URL_MODE:
    response = requests.get(INPUT_IMG_FILE)
    img = Image.open(BytesIO(response.content))
    img = img.resize((IM_WIDTH, IM_HEIGHT))
  elif FILE_MODE:
    img = INPUT_IMG_FILE
  else:
    img = image.load_img(INPUT_IMG_FILE, target_size=(IM_WIDTH, IM_HEIGHT))
    
  x = img
  
  if not FILE_MODE:
    x = image.img_to_array(img)
    x = np.expand_dims(x, axis=0)
    if PRE_PROCESSOR is not None:
      preprocess_input = PRE_PROCESSOR
      x = preprocess_input(x)
  
  model = MODEL
  if model is None:
    print ('No input model specified to generate predictions ...')
    return
  labels = LABELS
  
  heatmaps = []
  heatmap_sum = np.zeros(HEATMAP_SHAPE, float)  # start from zeros so the running sum (and mean) is correct
  
  last_conv_layer = model.get_layer(CONV_LAYER)  
  feature_size = tensor_featureSizeExtractor(last_conv_layer)
  
  for step in (range(EVAL_STEPS)):
    start = time.time()
    
    preds = model.predict(x)  
  
    probability = preds.flatten()
    
    prediction = []
    
    if labels is not None:
      prediction = labels[np.argmax(probability)]
    elif LABEL_DECODER is not None:
      prediction = pd.DataFrame(LABEL_DECODER(preds, top=3)[0],columns=['col1','category','probability']).iloc[:,1:]
      print('PREDICTION:',prediction.loc[0,'category'])
    else:
      print ('No labels will be generated ...')
      
    accuracy = probability[np.argmax(probability)]
  
    argmax = np.argmax(preds[0])
  
    output = model.output[:, argmax]
  
    grads = K.gradients(output, last_conv_layer.output)[0]
    pooled_grads = K.mean(grads, axis=(0, 1, 2))
    iterate = K.function([model.input], [pooled_grads, last_conv_layer.output[0]])
    pooled_grads_value, conv_layer_output_value = iterate([x])
    
    for i in range(feature_size):
      conv_layer_output_value[:,:,i] *= pooled_grads_value[i]
    
    heatmap = np.mean(conv_layer_output_value, axis=-1)
    heatmap = np.maximum(heatmap, 0)
    heatmap /= np.max(heatmap)
    
    try:
      heatmap_sum = np.add(heatmap_sum, 
                           heatmap)
      heatmaps.append(heatmap)
      if EVAL_STEPS >1:
        del (heatmap)
    except Exception:
      print ('Failed updating heatmaps')
    
    end = time.time()
    execution_time = end - start
    
    print ('Completed processing {} out of {} steps in {} seconds ...'.format(int(step+1), int(EVAL_STEPS), float(execution_time)))
    
  if EVAL_STEPS >1:
    mean_heatmap = heatmap_sum/EVAL_STEPS
  else:
    mean_heatmap = heatmap
  
  return [mean_heatmap, heatmaps, preds[0], prediction, accuracy, probability]

In [0]:
def tensor_featureSizeExtractor(last_conv_layer):
  if len(last_conv_layer.output.get_shape().as_list()) == 4:
    feature_size = last_conv_layer.output.get_shape().as_list()[3]
    return feature_size
  else:
    return 'Received tensor shape: {} instead of expected shape: 4'.format(len(last_conv_layer.output.get_shape().as_list()))

In [0]:
INPUT_IMG_FILE = 'https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/hummingbird_01.jpg'
CONV_LAYER = 'block5_conv3'

In [0]:
from keras.applications.vgg16 import preprocess_input as PRE_PROCESSOR
import requests 
from PIL import Image
from io import BytesIO
import time

In [35]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              MODEL=model_vgg16,
                              LABELS=None,
                              IM_WIDTH=224,
                              IM_HEIGHT=224,
                              CONV_LAYER=CONV_LAYER,
                              EVAL_STEPS=1,
                              URL_MODE=True,
                              FILE_MODE=False)


No labels will be generated ...
Completed processing 1 out of 1 steps in 0.15799641609191895 seconds ...

In [36]:
HEATMAP = output[0]

plt.matshow(HEATMAP)
plt.show()



In [0]:
def heatmap_overlay(INPUT_IMG_FILE,
                    HEATMAP,
                    THRESHOLD=0.8):
  img = cv2.imread(INPUT_IMG_FILE)
  
  heatmap = cv2.resize(HEATMAP, (img.shape[1], img.shape[0]))
  heatmap = np.uint8(255 * heatmap)
  heatmap = cv2.applyColorMap(heatmap, cv2.COLORMAP_JET)
  hif = THRESHOLD
  superimposed_img = heatmap * hif + img
  return [superimposed_img, heatmap]

In [0]:
INPUT_IMG_FILE = './hummingbird_01.jpg'

In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]

In [0]:
output_file = './class_activation_map.jpeg'
cv2.imwrite(output_file, superimposed_img)

img=mpimg.imread(output_file)

In [41]:
plt.imshow(img)


Out[41]:
<matplotlib.image.AxesImage at 0x7fe271c58cc0>

Part 3 -- Extending the CAM visualization to Inception-ResNet version 2:


In [0]:
from keras.applications.inception_resnet_v2 import InceptionResNetV2

%matplotlib inline
K.clear_session()

In [0]:
model_InceptionResNetV2 = InceptionResNetV2(weights='imagenet')

Generating Inception-ResNet version 2 summary


In [44]:
model_InceptionResNetV2.summary()


__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            (None, 299, 299, 3)  0                                            
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 149, 149, 32) 864         input_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 149, 149, 32) 96          conv2d_1[0][0]                   
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 149, 149, 32) 0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 147, 147, 32) 9216        activation_1[0][0]               
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 147, 147, 32) 96          conv2d_2[0][0]                   
__________________________________________________________________________________________________
activation_2 (Activation)       (None, 147, 147, 32) 0           batch_normalization_2[0][0]      
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 147, 147, 64) 18432       activation_2[0][0]               
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 147, 147, 64) 192         conv2d_3[0][0]                   
__________________________________________________________________________________________________
activation_3 (Activation)       (None, 147, 147, 64) 0           batch_normalization_3[0][0]      
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)  (None, 73, 73, 64)   0           activation_3[0][0]               
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, 73, 73, 80)   5120        max_pooling2d_1[0][0]            
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 73, 73, 80)   240         conv2d_4[0][0]                   
__________________________________________________________________________________________________
activation_4 (Activation)       (None, 73, 73, 80)   0           batch_normalization_4[0][0]      
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, 71, 71, 192)  138240      activation_4[0][0]               
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 71, 71, 192)  576         conv2d_5[0][0]                   
__________________________________________________________________________________________________
activation_5 (Activation)       (None, 71, 71, 192)  0           batch_normalization_5[0][0]      
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)  (None, 35, 35, 192)  0           activation_5[0][0]               
__________________________________________________________________________________________________
conv2d_9 (Conv2D)               (None, 35, 35, 64)   12288       max_pooling2d_2[0][0]            
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 35, 35, 64)   192         conv2d_9[0][0]                   
__________________________________________________________________________________________________
activation_9 (Activation)       (None, 35, 35, 64)   0           batch_normalization_9[0][0]      
__________________________________________________________________________________________________
conv2d_7 (Conv2D)               (None, 35, 35, 48)   9216        max_pooling2d_2[0][0]            
__________________________________________________________________________________________________
conv2d_10 (Conv2D)              (None, 35, 35, 96)   55296       activation_9[0][0]               
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 35, 35, 48)   144         conv2d_7[0][0]                   
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 35, 35, 96)   288         conv2d_10[0][0]                  
__________________________________________________________________________________________________
activation_7 (Activation)       (None, 35, 35, 48)   0           batch_normalization_7[0][0]      
__________________________________________________________________________________________________
activation_10 (Activation)      (None, 35, 35, 96)   0           batch_normalization_10[0][0]     
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, 35, 35, 192)  0           max_pooling2d_2[0][0]            
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, 35, 35, 96)   18432       max_pooling2d_2[0][0]            
__________________________________________________________________________________________________
conv2d_8 (Conv2D)               (None, 35, 35, 64)   76800       activation_7[0][0]               
__________________________________________________________________________________________________
conv2d_11 (Conv2D)              (None, 35, 35, 96)   82944       activation_10[0][0]              
__________________________________________________________________________________________________
conv2d_12 (Conv2D)              (None, 35, 35, 64)   12288       average_pooling2d_1[0][0]        
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 35, 35, 96)   288         conv2d_6[0][0]                   
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 35, 35, 64)   192         conv2d_8[0][0]                   
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 35, 35, 96)   288         conv2d_11[0][0]                  
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 35, 35, 64)   192         conv2d_12[0][0]                  
__________________________________________________________________________________________________
activation_6 (Activation)       (None, 35, 35, 96)   0           batch_normalization_6[0][0]      
__________________________________________________________________________________________________
activation_8 (Activation)       (None, 35, 35, 64)   0           batch_normalization_8[0][0]      
__________________________________________________________________________________________________
activation_11 (Activation)      (None, 35, 35, 96)   0           batch_normalization_11[0][0]     
__________________________________________________________________________________________________
activation_12 (Activation)      (None, 35, 35, 64)   0           batch_normalization_12[0][0]     
__________________________________________________________________________________________________
mixed_5b (Concatenate)          (None, 35, 35, 320)  0           activation_6[0][0]               
                                                                 activation_8[0][0]               
                                                                 activation_11[0][0]              
                                                                 activation_12[0][0]              
__________________________________________________________________________________________________
conv2d_16 (Conv2D)              (None, 35, 35, 32)   10240       mixed_5b[0][0]                   
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 35, 35, 32)   96          conv2d_16[0][0]                  
__________________________________________________________________________________________________
activation_16 (Activation)      (None, 35, 35, 32)   0           batch_normalization_16[0][0]     
__________________________________________________________________________________________________
conv2d_14 (Conv2D)              (None, 35, 35, 32)   10240       mixed_5b[0][0]                   
__________________________________________________________________________________________________
conv2d_17 (Conv2D)              (None, 35, 35, 48)   13824       activation_16[0][0]              
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 35, 35, 32)   96          conv2d_14[0][0]                  
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 35, 35, 48)   144         conv2d_17[0][0]                  
__________________________________________________________________________________________________
activation_14 (Activation)      (None, 35, 35, 32)   0           batch_normalization_14[0][0]     
__________________________________________________________________________________________________
activation_17 (Activation)      (None, 35, 35, 48)   0           batch_normalization_17[0][0]     
__________________________________________________________________________________________________
conv2d_13 (Conv2D)              (None, 35, 35, 32)   10240       mixed_5b[0][0]                   
__________________________________________________________________________________________________
conv2d_15 (Conv2D)              (None, 35, 35, 32)   9216        activation_14[0][0]              
__________________________________________________________________________________________________
conv2d_18 (Conv2D)              (None, 35, 35, 64)   27648       activation_17[0][0]              
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 35, 35, 32)   96          conv2d_13[0][0]                  
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 35, 35, 32)   96          conv2d_15[0][0]                  
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 35, 35, 64)   192         conv2d_18[0][0]                  
__________________________________________________________________________________________________
activation_13 (Activation)      (None, 35, 35, 32)   0           batch_normalization_13[0][0]     
__________________________________________________________________________________________________
activation_15 (Activation)      (None, 35, 35, 32)   0           batch_normalization_15[0][0]     
__________________________________________________________________________________________________
activation_18 (Activation)      (None, 35, 35, 64)   0           batch_normalization_18[0][0]     
__________________________________________________________________________________________________
block35_1_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_13[0][0]              
                                                                 activation_15[0][0]              
                                                                 activation_18[0][0]              
__________________________________________________________________________________________________
block35_1_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_1_mixed[0][0]            
__________________________________________________________________________________________________
block35_1 (Lambda)              (None, 35, 35, 320)  0           mixed_5b[0][0]                   
                                                                 block35_1_conv[0][0]             
__________________________________________________________________________________________________
block35_1_ac (Activation)       (None, 35, 35, 320)  0           block35_1[0][0]                  
__________________________________________________________________________________________________
conv2d_22 (Conv2D)              (None, 35, 35, 32)   10240       block35_1_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 35, 35, 32)   96          conv2d_22[0][0]                  
__________________________________________________________________________________________________
activation_22 (Activation)      (None, 35, 35, 32)   0           batch_normalization_22[0][0]     
__________________________________________________________________________________________________
conv2d_20 (Conv2D)              (None, 35, 35, 32)   10240       block35_1_ac[0][0]               
__________________________________________________________________________________________________
conv2d_23 (Conv2D)              (None, 35, 35, 48)   13824       activation_22[0][0]              
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 35, 35, 32)   96          conv2d_20[0][0]                  
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 35, 35, 48)   144         conv2d_23[0][0]                  
__________________________________________________________________________________________________
activation_20 (Activation)      (None, 35, 35, 32)   0           batch_normalization_20[0][0]     
__________________________________________________________________________________________________
activation_23 (Activation)      (None, 35, 35, 48)   0           batch_normalization_23[0][0]     
__________________________________________________________________________________________________
conv2d_19 (Conv2D)              (None, 35, 35, 32)   10240       block35_1_ac[0][0]               
__________________________________________________________________________________________________
conv2d_21 (Conv2D)              (None, 35, 35, 32)   9216        activation_20[0][0]              
__________________________________________________________________________________________________
conv2d_24 (Conv2D)              (None, 35, 35, 64)   27648       activation_23[0][0]              
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 35, 35, 32)   96          conv2d_19[0][0]                  
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 35, 35, 32)   96          conv2d_21[0][0]                  
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 35, 35, 64)   192         conv2d_24[0][0]                  
__________________________________________________________________________________________________
activation_19 (Activation)      (None, 35, 35, 32)   0           batch_normalization_19[0][0]     
__________________________________________________________________________________________________
activation_21 (Activation)      (None, 35, 35, 32)   0           batch_normalization_21[0][0]     
__________________________________________________________________________________________________
activation_24 (Activation)      (None, 35, 35, 64)   0           batch_normalization_24[0][0]     
__________________________________________________________________________________________________
block35_2_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_19[0][0]              
                                                                 activation_21[0][0]              
                                                                 activation_24[0][0]              
__________________________________________________________________________________________________
block35_2_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_2_mixed[0][0]            
__________________________________________________________________________________________________
block35_2 (Lambda)              (None, 35, 35, 320)  0           block35_1_ac[0][0]               
                                                                 block35_2_conv[0][0]             
__________________________________________________________________________________________________
block35_2_ac (Activation)       (None, 35, 35, 320)  0           block35_2[0][0]                  
__________________________________________________________________________________________________
conv2d_28 (Conv2D)              (None, 35, 35, 32)   10240       block35_2_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 35, 35, 32)   96          conv2d_28[0][0]                  
__________________________________________________________________________________________________
activation_28 (Activation)      (None, 35, 35, 32)   0           batch_normalization_28[0][0]     
__________________________________________________________________________________________________
conv2d_26 (Conv2D)              (None, 35, 35, 32)   10240       block35_2_ac[0][0]               
__________________________________________________________________________________________________
conv2d_29 (Conv2D)              (None, 35, 35, 48)   13824       activation_28[0][0]              
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 35, 35, 32)   96          conv2d_26[0][0]                  
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 35, 35, 48)   144         conv2d_29[0][0]                  
__________________________________________________________________________________________________
activation_26 (Activation)      (None, 35, 35, 32)   0           batch_normalization_26[0][0]     
__________________________________________________________________________________________________
activation_29 (Activation)      (None, 35, 35, 48)   0           batch_normalization_29[0][0]     
__________________________________________________________________________________________________
conv2d_25 (Conv2D)              (None, 35, 35, 32)   10240       block35_2_ac[0][0]               
__________________________________________________________________________________________________
conv2d_27 (Conv2D)              (None, 35, 35, 32)   9216        activation_26[0][0]              
__________________________________________________________________________________________________
conv2d_30 (Conv2D)              (None, 35, 35, 64)   27648       activation_29[0][0]              
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 35, 35, 32)   96          conv2d_25[0][0]                  
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 35, 35, 32)   96          conv2d_27[0][0]                  
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 35, 35, 64)   192         conv2d_30[0][0]                  
__________________________________________________________________________________________________
activation_25 (Activation)      (None, 35, 35, 32)   0           batch_normalization_25[0][0]     
__________________________________________________________________________________________________
activation_27 (Activation)      (None, 35, 35, 32)   0           batch_normalization_27[0][0]     
__________________________________________________________________________________________________
activation_30 (Activation)      (None, 35, 35, 64)   0           batch_normalization_30[0][0]     
__________________________________________________________________________________________________
block35_3_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_25[0][0]              
                                                                 activation_27[0][0]              
                                                                 activation_30[0][0]              
__________________________________________________________________________________________________
block35_3_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_3_mixed[0][0]            
__________________________________________________________________________________________________
block35_3 (Lambda)              (None, 35, 35, 320)  0           block35_2_ac[0][0]               
                                                                 block35_3_conv[0][0]             
__________________________________________________________________________________________________
block35_3_ac (Activation)       (None, 35, 35, 320)  0           block35_3[0][0]                  
__________________________________________________________________________________________________
conv2d_34 (Conv2D)              (None, 35, 35, 32)   10240       block35_3_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 35, 35, 32)   96          conv2d_34[0][0]                  
__________________________________________________________________________________________________
activation_34 (Activation)      (None, 35, 35, 32)   0           batch_normalization_34[0][0]     
__________________________________________________________________________________________________
conv2d_32 (Conv2D)              (None, 35, 35, 32)   10240       block35_3_ac[0][0]               
__________________________________________________________________________________________________
conv2d_35 (Conv2D)              (None, 35, 35, 48)   13824       activation_34[0][0]              
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 35, 35, 32)   96          conv2d_32[0][0]                  
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 35, 35, 48)   144         conv2d_35[0][0]                  
__________________________________________________________________________________________________
activation_32 (Activation)      (None, 35, 35, 32)   0           batch_normalization_32[0][0]     
__________________________________________________________________________________________________
activation_35 (Activation)      (None, 35, 35, 48)   0           batch_normalization_35[0][0]     
__________________________________________________________________________________________________
conv2d_31 (Conv2D)              (None, 35, 35, 32)   10240       block35_3_ac[0][0]               
__________________________________________________________________________________________________
conv2d_33 (Conv2D)              (None, 35, 35, 32)   9216        activation_32[0][0]              
__________________________________________________________________________________________________
conv2d_36 (Conv2D)              (None, 35, 35, 64)   27648       activation_35[0][0]              
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 35, 35, 32)   96          conv2d_31[0][0]                  
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 35, 35, 32)   96          conv2d_33[0][0]                  
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 35, 35, 64)   192         conv2d_36[0][0]                  
__________________________________________________________________________________________________
activation_31 (Activation)      (None, 35, 35, 32)   0           batch_normalization_31[0][0]     
__________________________________________________________________________________________________
activation_33 (Activation)      (None, 35, 35, 32)   0           batch_normalization_33[0][0]     
__________________________________________________________________________________________________
activation_36 (Activation)      (None, 35, 35, 64)   0           batch_normalization_36[0][0]     
__________________________________________________________________________________________________
block35_4_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_31[0][0]              
                                                                 activation_33[0][0]              
                                                                 activation_36[0][0]              
__________________________________________________________________________________________________
block35_4_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_4_mixed[0][0]            
__________________________________________________________________________________________________
block35_4 (Lambda)              (None, 35, 35, 320)  0           block35_3_ac[0][0]               
                                                                 block35_4_conv[0][0]             
__________________________________________________________________________________________________
block35_4_ac (Activation)       (None, 35, 35, 320)  0           block35_4[0][0]                  
__________________________________________________________________________________________________
conv2d_40 (Conv2D)              (None, 35, 35, 32)   10240       block35_4_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 35, 35, 32)   96          conv2d_40[0][0]                  
__________________________________________________________________________________________________
activation_40 (Activation)      (None, 35, 35, 32)   0           batch_normalization_40[0][0]     
__________________________________________________________________________________________________
conv2d_38 (Conv2D)              (None, 35, 35, 32)   10240       block35_4_ac[0][0]               
__________________________________________________________________________________________________
conv2d_41 (Conv2D)              (None, 35, 35, 48)   13824       activation_40[0][0]              
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 35, 35, 32)   96          conv2d_38[0][0]                  
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 35, 35, 48)   144         conv2d_41[0][0]                  
__________________________________________________________________________________________________
activation_38 (Activation)      (None, 35, 35, 32)   0           batch_normalization_38[0][0]     
__________________________________________________________________________________________________
activation_41 (Activation)      (None, 35, 35, 48)   0           batch_normalization_41[0][0]     
__________________________________________________________________________________________________
conv2d_37 (Conv2D)              (None, 35, 35, 32)   10240       block35_4_ac[0][0]               
__________________________________________________________________________________________________
conv2d_39 (Conv2D)              (None, 35, 35, 32)   9216        activation_38[0][0]              
__________________________________________________________________________________________________
conv2d_42 (Conv2D)              (None, 35, 35, 64)   27648       activation_41[0][0]              
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 35, 35, 32)   96          conv2d_37[0][0]                  
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 35, 35, 32)   96          conv2d_39[0][0]                  
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 35, 35, 64)   192         conv2d_42[0][0]                  
__________________________________________________________________________________________________
activation_37 (Activation)      (None, 35, 35, 32)   0           batch_normalization_37[0][0]     
__________________________________________________________________________________________________
activation_39 (Activation)      (None, 35, 35, 32)   0           batch_normalization_39[0][0]     
__________________________________________________________________________________________________
activation_42 (Activation)      (None, 35, 35, 64)   0           batch_normalization_42[0][0]     
__________________________________________________________________________________________________
block35_5_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_37[0][0]              
                                                                 activation_39[0][0]              
                                                                 activation_42[0][0]              
__________________________________________________________________________________________________
block35_5_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_5_mixed[0][0]            
__________________________________________________________________________________________________
block35_5 (Lambda)              (None, 35, 35, 320)  0           block35_4_ac[0][0]               
                                                                 block35_5_conv[0][0]             
__________________________________________________________________________________________________
block35_5_ac (Activation)       (None, 35, 35, 320)  0           block35_5[0][0]                  
__________________________________________________________________________________________________
conv2d_46 (Conv2D)              (None, 35, 35, 32)   10240       block35_5_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 35, 35, 32)   96          conv2d_46[0][0]                  
__________________________________________________________________________________________________
activation_46 (Activation)      (None, 35, 35, 32)   0           batch_normalization_46[0][0]     
__________________________________________________________________________________________________
conv2d_44 (Conv2D)              (None, 35, 35, 32)   10240       block35_5_ac[0][0]               
__________________________________________________________________________________________________
conv2d_47 (Conv2D)              (None, 35, 35, 48)   13824       activation_46[0][0]              
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 35, 35, 32)   96          conv2d_44[0][0]                  
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 35, 35, 48)   144         conv2d_47[0][0]                  
__________________________________________________________________________________________________
activation_44 (Activation)      (None, 35, 35, 32)   0           batch_normalization_44[0][0]     
__________________________________________________________________________________________________
activation_47 (Activation)      (None, 35, 35, 48)   0           batch_normalization_47[0][0]     
__________________________________________________________________________________________________
conv2d_43 (Conv2D)              (None, 35, 35, 32)   10240       block35_5_ac[0][0]               
__________________________________________________________________________________________________
conv2d_45 (Conv2D)              (None, 35, 35, 32)   9216        activation_44[0][0]              
__________________________________________________________________________________________________
conv2d_48 (Conv2D)              (None, 35, 35, 64)   27648       activation_47[0][0]              
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 35, 35, 32)   96          conv2d_43[0][0]                  
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 35, 35, 32)   96          conv2d_45[0][0]                  
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 35, 35, 64)   192         conv2d_48[0][0]                  
__________________________________________________________________________________________________
activation_43 (Activation)      (None, 35, 35, 32)   0           batch_normalization_43[0][0]     
__________________________________________________________________________________________________
activation_45 (Activation)      (None, 35, 35, 32)   0           batch_normalization_45[0][0]     
__________________________________________________________________________________________________
activation_48 (Activation)      (None, 35, 35, 64)   0           batch_normalization_48[0][0]     
__________________________________________________________________________________________________
block35_6_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_43[0][0]              
                                                                 activation_45[0][0]              
                                                                 activation_48[0][0]              
__________________________________________________________________________________________________
block35_6_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_6_mixed[0][0]            
__________________________________________________________________________________________________
block35_6 (Lambda)              (None, 35, 35, 320)  0           block35_5_ac[0][0]               
                                                                 block35_6_conv[0][0]             
__________________________________________________________________________________________________
block35_6_ac (Activation)       (None, 35, 35, 320)  0           block35_6[0][0]                  
__________________________________________________________________________________________________
conv2d_52 (Conv2D)              (None, 35, 35, 32)   10240       block35_6_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 35, 35, 32)   96          conv2d_52[0][0]                  
__________________________________________________________________________________________________
activation_52 (Activation)      (None, 35, 35, 32)   0           batch_normalization_52[0][0]     
__________________________________________________________________________________________________
conv2d_50 (Conv2D)              (None, 35, 35, 32)   10240       block35_6_ac[0][0]               
__________________________________________________________________________________________________
conv2d_53 (Conv2D)              (None, 35, 35, 48)   13824       activation_52[0][0]              
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 35, 35, 32)   96          conv2d_50[0][0]                  
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 35, 35, 48)   144         conv2d_53[0][0]                  
__________________________________________________________________________________________________
activation_50 (Activation)      (None, 35, 35, 32)   0           batch_normalization_50[0][0]     
__________________________________________________________________________________________________
activation_53 (Activation)      (None, 35, 35, 48)   0           batch_normalization_53[0][0]     
__________________________________________________________________________________________________
conv2d_49 (Conv2D)              (None, 35, 35, 32)   10240       block35_6_ac[0][0]               
__________________________________________________________________________________________________
conv2d_51 (Conv2D)              (None, 35, 35, 32)   9216        activation_50[0][0]              
__________________________________________________________________________________________________
conv2d_54 (Conv2D)              (None, 35, 35, 64)   27648       activation_53[0][0]              
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 35, 35, 32)   96          conv2d_49[0][0]                  
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 35, 35, 32)   96          conv2d_51[0][0]                  
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 35, 35, 64)   192         conv2d_54[0][0]                  
__________________________________________________________________________________________________
activation_49 (Activation)      (None, 35, 35, 32)   0           batch_normalization_49[0][0]     
__________________________________________________________________________________________________
activation_51 (Activation)      (None, 35, 35, 32)   0           batch_normalization_51[0][0]     
__________________________________________________________________________________________________
activation_54 (Activation)      (None, 35, 35, 64)   0           batch_normalization_54[0][0]     
__________________________________________________________________________________________________
block35_7_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_49[0][0]              
                                                                 activation_51[0][0]              
                                                                 activation_54[0][0]              
__________________________________________________________________________________________________
block35_7_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_7_mixed[0][0]            
__________________________________________________________________________________________________
block35_7 (Lambda)              (None, 35, 35, 320)  0           block35_6_ac[0][0]               
                                                                 block35_7_conv[0][0]             
__________________________________________________________________________________________________
block35_7_ac (Activation)       (None, 35, 35, 320)  0           block35_7[0][0]                  
__________________________________________________________________________________________________
conv2d_58 (Conv2D)              (None, 35, 35, 32)   10240       block35_7_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 35, 35, 32)   96          conv2d_58[0][0]                  
__________________________________________________________________________________________________
activation_58 (Activation)      (None, 35, 35, 32)   0           batch_normalization_58[0][0]     
__________________________________________________________________________________________________
conv2d_56 (Conv2D)              (None, 35, 35, 32)   10240       block35_7_ac[0][0]               
__________________________________________________________________________________________________
conv2d_59 (Conv2D)              (None, 35, 35, 48)   13824       activation_58[0][0]              
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 35, 35, 32)   96          conv2d_56[0][0]                  
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 35, 35, 48)   144         conv2d_59[0][0]                  
__________________________________________________________________________________________________
activation_56 (Activation)      (None, 35, 35, 32)   0           batch_normalization_56[0][0]     
__________________________________________________________________________________________________
activation_59 (Activation)      (None, 35, 35, 48)   0           batch_normalization_59[0][0]     
__________________________________________________________________________________________________
conv2d_55 (Conv2D)              (None, 35, 35, 32)   10240       block35_7_ac[0][0]               
__________________________________________________________________________________________________
conv2d_57 (Conv2D)              (None, 35, 35, 32)   9216        activation_56[0][0]              
__________________________________________________________________________________________________
conv2d_60 (Conv2D)              (None, 35, 35, 64)   27648       activation_59[0][0]              
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 35, 35, 32)   96          conv2d_55[0][0]                  
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 35, 35, 32)   96          conv2d_57[0][0]                  
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 35, 35, 64)   192         conv2d_60[0][0]                  
__________________________________________________________________________________________________
activation_55 (Activation)      (None, 35, 35, 32)   0           batch_normalization_55[0][0]     
__________________________________________________________________________________________________
activation_57 (Activation)      (None, 35, 35, 32)   0           batch_normalization_57[0][0]     
__________________________________________________________________________________________________
activation_60 (Activation)      (None, 35, 35, 64)   0           batch_normalization_60[0][0]     
__________________________________________________________________________________________________
block35_8_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_55[0][0]              
                                                                 activation_57[0][0]              
                                                                 activation_60[0][0]              
__________________________________________________________________________________________________
block35_8_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_8_mixed[0][0]            
__________________________________________________________________________________________________
block35_8 (Lambda)              (None, 35, 35, 320)  0           block35_7_ac[0][0]               
                                                                 block35_8_conv[0][0]             
__________________________________________________________________________________________________
block35_8_ac (Activation)       (None, 35, 35, 320)  0           block35_8[0][0]                  
__________________________________________________________________________________________________
conv2d_64 (Conv2D)              (None, 35, 35, 32)   10240       block35_8_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 35, 35, 32)   96          conv2d_64[0][0]                  
__________________________________________________________________________________________________
activation_64 (Activation)      (None, 35, 35, 32)   0           batch_normalization_64[0][0]     
__________________________________________________________________________________________________
conv2d_62 (Conv2D)              (None, 35, 35, 32)   10240       block35_8_ac[0][0]               
__________________________________________________________________________________________________
conv2d_65 (Conv2D)              (None, 35, 35, 48)   13824       activation_64[0][0]              
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 35, 35, 32)   96          conv2d_62[0][0]                  
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 35, 35, 48)   144         conv2d_65[0][0]                  
__________________________________________________________________________________________________
activation_62 (Activation)      (None, 35, 35, 32)   0           batch_normalization_62[0][0]     
__________________________________________________________________________________________________
activation_65 (Activation)      (None, 35, 35, 48)   0           batch_normalization_65[0][0]     
__________________________________________________________________________________________________
conv2d_61 (Conv2D)              (None, 35, 35, 32)   10240       block35_8_ac[0][0]               
__________________________________________________________________________________________________
conv2d_63 (Conv2D)              (None, 35, 35, 32)   9216        activation_62[0][0]              
__________________________________________________________________________________________________
conv2d_66 (Conv2D)              (None, 35, 35, 64)   27648       activation_65[0][0]              
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 35, 35, 32)   96          conv2d_61[0][0]                  
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 35, 35, 32)   96          conv2d_63[0][0]                  
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 35, 35, 64)   192         conv2d_66[0][0]                  
__________________________________________________________________________________________________
activation_61 (Activation)      (None, 35, 35, 32)   0           batch_normalization_61[0][0]     
__________________________________________________________________________________________________
activation_63 (Activation)      (None, 35, 35, 32)   0           batch_normalization_63[0][0]     
__________________________________________________________________________________________________
activation_66 (Activation)      (None, 35, 35, 64)   0           batch_normalization_66[0][0]     
__________________________________________________________________________________________________
block35_9_mixed (Concatenate)   (None, 35, 35, 128)  0           activation_61[0][0]              
                                                                 activation_63[0][0]              
                                                                 activation_66[0][0]              
__________________________________________________________________________________________________
block35_9_conv (Conv2D)         (None, 35, 35, 320)  41280       block35_9_mixed[0][0]            
__________________________________________________________________________________________________
block35_9 (Lambda)              (None, 35, 35, 320)  0           block35_8_ac[0][0]               
                                                                 block35_9_conv[0][0]             
__________________________________________________________________________________________________
block35_9_ac (Activation)       (None, 35, 35, 320)  0           block35_9[0][0]                  
__________________________________________________________________________________________________
conv2d_70 (Conv2D)              (None, 35, 35, 32)   10240       block35_9_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 35, 35, 32)   96          conv2d_70[0][0]                  
__________________________________________________________________________________________________
activation_70 (Activation)      (None, 35, 35, 32)   0           batch_normalization_70[0][0]     
__________________________________________________________________________________________________
conv2d_68 (Conv2D)              (None, 35, 35, 32)   10240       block35_9_ac[0][0]               
__________________________________________________________________________________________________
conv2d_71 (Conv2D)              (None, 35, 35, 48)   13824       activation_70[0][0]              
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 35, 35, 32)   96          conv2d_68[0][0]                  
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 35, 35, 48)   144         conv2d_71[0][0]                  
__________________________________________________________________________________________________
activation_68 (Activation)      (None, 35, 35, 32)   0           batch_normalization_68[0][0]     
__________________________________________________________________________________________________
activation_71 (Activation)      (None, 35, 35, 48)   0           batch_normalization_71[0][0]     
__________________________________________________________________________________________________
conv2d_67 (Conv2D)              (None, 35, 35, 32)   10240       block35_9_ac[0][0]               
__________________________________________________________________________________________________
conv2d_69 (Conv2D)              (None, 35, 35, 32)   9216        activation_68[0][0]              
__________________________________________________________________________________________________
conv2d_72 (Conv2D)              (None, 35, 35, 64)   27648       activation_71[0][0]              
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 35, 35, 32)   96          conv2d_67[0][0]                  
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 35, 35, 32)   96          conv2d_69[0][0]                  
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 35, 35, 64)   192         conv2d_72[0][0]                  
__________________________________________________________________________________________________
activation_67 (Activation)      (None, 35, 35, 32)   0           batch_normalization_67[0][0]     
__________________________________________________________________________________________________
activation_69 (Activation)      (None, 35, 35, 32)   0           batch_normalization_69[0][0]     
__________________________________________________________________________________________________
activation_72 (Activation)      (None, 35, 35, 64)   0           batch_normalization_72[0][0]     
__________________________________________________________________________________________________
block35_10_mixed (Concatenate)  (None, 35, 35, 128)  0           activation_67[0][0]              
                                                                 activation_69[0][0]              
                                                                 activation_72[0][0]              
__________________________________________________________________________________________________
block35_10_conv (Conv2D)        (None, 35, 35, 320)  41280       block35_10_mixed[0][0]           
__________________________________________________________________________________________________
block35_10 (Lambda)             (None, 35, 35, 320)  0           block35_9_ac[0][0]               
                                                                 block35_10_conv[0][0]            
__________________________________________________________________________________________________
block35_10_ac (Activation)      (None, 35, 35, 320)  0           block35_10[0][0]                 
__________________________________________________________________________________________________
conv2d_74 (Conv2D)              (None, 35, 35, 256)  81920       block35_10_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 35, 35, 256)  768         conv2d_74[0][0]                  
__________________________________________________________________________________________________
activation_74 (Activation)      (None, 35, 35, 256)  0           batch_normalization_74[0][0]     
__________________________________________________________________________________________________
conv2d_75 (Conv2D)              (None, 35, 35, 256)  589824      activation_74[0][0]              
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 35, 35, 256)  768         conv2d_75[0][0]                  
__________________________________________________________________________________________________
activation_75 (Activation)      (None, 35, 35, 256)  0           batch_normalization_75[0][0]     
__________________________________________________________________________________________________
conv2d_73 (Conv2D)              (None, 17, 17, 384)  1105920     block35_10_ac[0][0]              
__________________________________________________________________________________________________
conv2d_76 (Conv2D)              (None, 17, 17, 384)  884736      activation_75[0][0]              
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 17, 17, 384)  1152        conv2d_73[0][0]                  
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 17, 17, 384)  1152        conv2d_76[0][0]                  
__________________________________________________________________________________________________
activation_73 (Activation)      (None, 17, 17, 384)  0           batch_normalization_73[0][0]     
__________________________________________________________________________________________________
activation_76 (Activation)      (None, 17, 17, 384)  0           batch_normalization_76[0][0]     
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)  (None, 17, 17, 320)  0           block35_10_ac[0][0]              
__________________________________________________________________________________________________
mixed_6a (Concatenate)          (None, 17, 17, 1088) 0           activation_73[0][0]              
                                                                 activation_76[0][0]              
                                                                 max_pooling2d_3[0][0]            
__________________________________________________________________________________________________
conv2d_78 (Conv2D)              (None, 17, 17, 128)  139264      mixed_6a[0][0]                   
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 17, 17, 128)  384         conv2d_78[0][0]                  
__________________________________________________________________________________________________
activation_78 (Activation)      (None, 17, 17, 128)  0           batch_normalization_78[0][0]     
__________________________________________________________________________________________________
conv2d_79 (Conv2D)              (None, 17, 17, 160)  143360      activation_78[0][0]              
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 17, 17, 160)  480         conv2d_79[0][0]                  
__________________________________________________________________________________________________
activation_79 (Activation)      (None, 17, 17, 160)  0           batch_normalization_79[0][0]     
__________________________________________________________________________________________________
conv2d_77 (Conv2D)              (None, 17, 17, 192)  208896      mixed_6a[0][0]                   
__________________________________________________________________________________________________
conv2d_80 (Conv2D)              (None, 17, 17, 192)  215040      activation_79[0][0]              
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 17, 17, 192)  576         conv2d_77[0][0]                  
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 17, 17, 192)  576         conv2d_80[0][0]                  
__________________________________________________________________________________________________
activation_77 (Activation)      (None, 17, 17, 192)  0           batch_normalization_77[0][0]     
__________________________________________________________________________________________________
activation_80 (Activation)      (None, 17, 17, 192)  0           batch_normalization_80[0][0]     
__________________________________________________________________________________________________
block17_1_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_77[0][0]              
                                                                 activation_80[0][0]              
__________________________________________________________________________________________________
block17_1_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_1_mixed[0][0]            
__________________________________________________________________________________________________
block17_1 (Lambda)              (None, 17, 17, 1088) 0           mixed_6a[0][0]                   
                                                                 block17_1_conv[0][0]             
__________________________________________________________________________________________________
block17_1_ac (Activation)       (None, 17, 17, 1088) 0           block17_1[0][0]                  
__________________________________________________________________________________________________
conv2d_82 (Conv2D)              (None, 17, 17, 128)  139264      block17_1_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, 17, 17, 128)  384         conv2d_82[0][0]                  
__________________________________________________________________________________________________
activation_82 (Activation)      (None, 17, 17, 128)  0           batch_normalization_82[0][0]     
__________________________________________________________________________________________________
conv2d_83 (Conv2D)              (None, 17, 17, 160)  143360      activation_82[0][0]              
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, 17, 17, 160)  480         conv2d_83[0][0]                  
__________________________________________________________________________________________________
activation_83 (Activation)      (None, 17, 17, 160)  0           batch_normalization_83[0][0]     
__________________________________________________________________________________________________
conv2d_81 (Conv2D)              (None, 17, 17, 192)  208896      block17_1_ac[0][0]               
__________________________________________________________________________________________________
conv2d_84 (Conv2D)              (None, 17, 17, 192)  215040      activation_83[0][0]              
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, 17, 17, 192)  576         conv2d_81[0][0]                  
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, 17, 17, 192)  576         conv2d_84[0][0]                  
__________________________________________________________________________________________________
activation_81 (Activation)      (None, 17, 17, 192)  0           batch_normalization_81[0][0]     
__________________________________________________________________________________________________
activation_84 (Activation)      (None, 17, 17, 192)  0           batch_normalization_84[0][0]     
__________________________________________________________________________________________________
block17_2_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_81[0][0]              
                                                                 activation_84[0][0]              
__________________________________________________________________________________________________
block17_2_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_2_mixed[0][0]            
__________________________________________________________________________________________________
block17_2 (Lambda)              (None, 17, 17, 1088) 0           block17_1_ac[0][0]               
                                                                 block17_2_conv[0][0]             
__________________________________________________________________________________________________
block17_2_ac (Activation)       (None, 17, 17, 1088) 0           block17_2[0][0]                  
__________________________________________________________________________________________________
conv2d_86 (Conv2D)              (None, 17, 17, 128)  139264      block17_2_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, 17, 17, 128)  384         conv2d_86[0][0]                  
__________________________________________________________________________________________________
activation_86 (Activation)      (None, 17, 17, 128)  0           batch_normalization_86[0][0]     
__________________________________________________________________________________________________
conv2d_87 (Conv2D)              (None, 17, 17, 160)  143360      activation_86[0][0]              
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, 17, 17, 160)  480         conv2d_87[0][0]                  
__________________________________________________________________________________________________
activation_87 (Activation)      (None, 17, 17, 160)  0           batch_normalization_87[0][0]     
__________________________________________________________________________________________________
conv2d_85 (Conv2D)              (None, 17, 17, 192)  208896      block17_2_ac[0][0]               
__________________________________________________________________________________________________
conv2d_88 (Conv2D)              (None, 17, 17, 192)  215040      activation_87[0][0]              
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, 17, 17, 192)  576         conv2d_85[0][0]                  
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, 17, 17, 192)  576         conv2d_88[0][0]                  
__________________________________________________________________________________________________
activation_85 (Activation)      (None, 17, 17, 192)  0           batch_normalization_85[0][0]     
__________________________________________________________________________________________________
activation_88 (Activation)      (None, 17, 17, 192)  0           batch_normalization_88[0][0]     
__________________________________________________________________________________________________
block17_3_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_85[0][0]              
                                                                 activation_88[0][0]              
__________________________________________________________________________________________________
block17_3_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_3_mixed[0][0]            
__________________________________________________________________________________________________
block17_3 (Lambda)              (None, 17, 17, 1088) 0           block17_2_ac[0][0]               
                                                                 block17_3_conv[0][0]             
__________________________________________________________________________________________________
block17_3_ac (Activation)       (None, 17, 17, 1088) 0           block17_3[0][0]                  
__________________________________________________________________________________________________
conv2d_90 (Conv2D)              (None, 17, 17, 128)  139264      block17_3_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, 17, 17, 128)  384         conv2d_90[0][0]                  
__________________________________________________________________________________________________
activation_90 (Activation)      (None, 17, 17, 128)  0           batch_normalization_90[0][0]     
__________________________________________________________________________________________________
conv2d_91 (Conv2D)              (None, 17, 17, 160)  143360      activation_90[0][0]              
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, 17, 17, 160)  480         conv2d_91[0][0]                  
__________________________________________________________________________________________________
activation_91 (Activation)      (None, 17, 17, 160)  0           batch_normalization_91[0][0]     
__________________________________________________________________________________________________
conv2d_89 (Conv2D)              (None, 17, 17, 192)  208896      block17_3_ac[0][0]               
__________________________________________________________________________________________________
conv2d_92 (Conv2D)              (None, 17, 17, 192)  215040      activation_91[0][0]              
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, 17, 17, 192)  576         conv2d_89[0][0]                  
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, 17, 17, 192)  576         conv2d_92[0][0]                  
__________________________________________________________________________________________________
activation_89 (Activation)      (None, 17, 17, 192)  0           batch_normalization_89[0][0]     
__________________________________________________________________________________________________
activation_92 (Activation)      (None, 17, 17, 192)  0           batch_normalization_92[0][0]     
__________________________________________________________________________________________________
block17_4_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_89[0][0]              
                                                                 activation_92[0][0]              
__________________________________________________________________________________________________
block17_4_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_4_mixed[0][0]            
__________________________________________________________________________________________________
block17_4 (Lambda)              (None, 17, 17, 1088) 0           block17_3_ac[0][0]               
                                                                 block17_4_conv[0][0]             
__________________________________________________________________________________________________
block17_4_ac (Activation)       (None, 17, 17, 1088) 0           block17_4[0][0]                  
__________________________________________________________________________________________________
conv2d_94 (Conv2D)              (None, 17, 17, 128)  139264      block17_4_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, 17, 17, 128)  384         conv2d_94[0][0]                  
__________________________________________________________________________________________________
activation_94 (Activation)      (None, 17, 17, 128)  0           batch_normalization_94[0][0]     
__________________________________________________________________________________________________
conv2d_95 (Conv2D)              (None, 17, 17, 160)  143360      activation_94[0][0]              
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, 17, 17, 160)  480         conv2d_95[0][0]                  
__________________________________________________________________________________________________
activation_95 (Activation)      (None, 17, 17, 160)  0           batch_normalization_95[0][0]     
__________________________________________________________________________________________________
conv2d_93 (Conv2D)              (None, 17, 17, 192)  208896      block17_4_ac[0][0]               
__________________________________________________________________________________________________
conv2d_96 (Conv2D)              (None, 17, 17, 192)  215040      activation_95[0][0]              
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, 17, 17, 192)  576         conv2d_93[0][0]                  
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, 17, 17, 192)  576         conv2d_96[0][0]                  
__________________________________________________________________________________________________
activation_93 (Activation)      (None, 17, 17, 192)  0           batch_normalization_93[0][0]     
__________________________________________________________________________________________________
activation_96 (Activation)      (None, 17, 17, 192)  0           batch_normalization_96[0][0]     
__________________________________________________________________________________________________
block17_5_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_93[0][0]              
                                                                 activation_96[0][0]              
__________________________________________________________________________________________________
block17_5_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_5_mixed[0][0]            
__________________________________________________________________________________________________
block17_5 (Lambda)              (None, 17, 17, 1088) 0           block17_4_ac[0][0]               
                                                                 block17_5_conv[0][0]             
__________________________________________________________________________________________________
block17_5_ac (Activation)       (None, 17, 17, 1088) 0           block17_5[0][0]                  
__________________________________________________________________________________________________
conv2d_98 (Conv2D)              (None, 17, 17, 128)  139264      block17_5_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, 17, 17, 128)  384         conv2d_98[0][0]                  
__________________________________________________________________________________________________
activation_98 (Activation)      (None, 17, 17, 128)  0           batch_normalization_98[0][0]     
__________________________________________________________________________________________________
conv2d_99 (Conv2D)              (None, 17, 17, 160)  143360      activation_98[0][0]              
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, 17, 17, 160)  480         conv2d_99[0][0]                  
__________________________________________________________________________________________________
activation_99 (Activation)      (None, 17, 17, 160)  0           batch_normalization_99[0][0]     
__________________________________________________________________________________________________
conv2d_97 (Conv2D)              (None, 17, 17, 192)  208896      block17_5_ac[0][0]               
__________________________________________________________________________________________________
conv2d_100 (Conv2D)             (None, 17, 17, 192)  215040      activation_99[0][0]              
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, 17, 17, 192)  576         conv2d_97[0][0]                  
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, 17, 17, 192)  576         conv2d_100[0][0]                 
__________________________________________________________________________________________________
activation_97 (Activation)      (None, 17, 17, 192)  0           batch_normalization_97[0][0]     
__________________________________________________________________________________________________
activation_100 (Activation)     (None, 17, 17, 192)  0           batch_normalization_100[0][0]    
__________________________________________________________________________________________________
block17_6_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_97[0][0]              
                                                                 activation_100[0][0]             
__________________________________________________________________________________________________
block17_6_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_6_mixed[0][0]            
__________________________________________________________________________________________________
block17_6 (Lambda)              (None, 17, 17, 1088) 0           block17_5_ac[0][0]               
                                                                 block17_6_conv[0][0]             
__________________________________________________________________________________________________
block17_6_ac (Activation)       (None, 17, 17, 1088) 0           block17_6[0][0]                  
__________________________________________________________________________________________________
conv2d_102 (Conv2D)             (None, 17, 17, 128)  139264      block17_6_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, 17, 17, 128)  384         conv2d_102[0][0]                 
__________________________________________________________________________________________________
activation_102 (Activation)     (None, 17, 17, 128)  0           batch_normalization_102[0][0]    
__________________________________________________________________________________________________
conv2d_103 (Conv2D)             (None, 17, 17, 160)  143360      activation_102[0][0]             
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, 17, 17, 160)  480         conv2d_103[0][0]                 
__________________________________________________________________________________________________
activation_103 (Activation)     (None, 17, 17, 160)  0           batch_normalization_103[0][0]    
__________________________________________________________________________________________________
conv2d_101 (Conv2D)             (None, 17, 17, 192)  208896      block17_6_ac[0][0]               
__________________________________________________________________________________________________
conv2d_104 (Conv2D)             (None, 17, 17, 192)  215040      activation_103[0][0]             
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, 17, 17, 192)  576         conv2d_101[0][0]                 
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, 17, 17, 192)  576         conv2d_104[0][0]                 
__________________________________________________________________________________________________
activation_101 (Activation)     (None, 17, 17, 192)  0           batch_normalization_101[0][0]    
__________________________________________________________________________________________________
activation_104 (Activation)     (None, 17, 17, 192)  0           batch_normalization_104[0][0]    
__________________________________________________________________________________________________
block17_7_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_101[0][0]             
                                                                 activation_104[0][0]             
__________________________________________________________________________________________________
block17_7_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_7_mixed[0][0]            
__________________________________________________________________________________________________
block17_7 (Lambda)              (None, 17, 17, 1088) 0           block17_6_ac[0][0]               
                                                                 block17_7_conv[0][0]             
__________________________________________________________________________________________________
block17_7_ac (Activation)       (None, 17, 17, 1088) 0           block17_7[0][0]                  
__________________________________________________________________________________________________
conv2d_106 (Conv2D)             (None, 17, 17, 128)  139264      block17_7_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_106 (BatchN (None, 17, 17, 128)  384         conv2d_106[0][0]                 
__________________________________________________________________________________________________
activation_106 (Activation)     (None, 17, 17, 128)  0           batch_normalization_106[0][0]    
__________________________________________________________________________________________________
conv2d_107 (Conv2D)             (None, 17, 17, 160)  143360      activation_106[0][0]             
__________________________________________________________________________________________________
batch_normalization_107 (BatchN (None, 17, 17, 160)  480         conv2d_107[0][0]                 
__________________________________________________________________________________________________
activation_107 (Activation)     (None, 17, 17, 160)  0           batch_normalization_107[0][0]    
__________________________________________________________________________________________________
conv2d_105 (Conv2D)             (None, 17, 17, 192)  208896      block17_7_ac[0][0]               
__________________________________________________________________________________________________
conv2d_108 (Conv2D)             (None, 17, 17, 192)  215040      activation_107[0][0]             
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, 17, 17, 192)  576         conv2d_105[0][0]                 
__________________________________________________________________________________________________
batch_normalization_108 (BatchN (None, 17, 17, 192)  576         conv2d_108[0][0]                 
__________________________________________________________________________________________________
activation_105 (Activation)     (None, 17, 17, 192)  0           batch_normalization_105[0][0]    
__________________________________________________________________________________________________
activation_108 (Activation)     (None, 17, 17, 192)  0           batch_normalization_108[0][0]    
__________________________________________________________________________________________________
block17_8_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_105[0][0]             
                                                                 activation_108[0][0]             
__________________________________________________________________________________________________
block17_8_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_8_mixed[0][0]            
__________________________________________________________________________________________________
block17_8 (Lambda)              (None, 17, 17, 1088) 0           block17_7_ac[0][0]               
                                                                 block17_8_conv[0][0]             
__________________________________________________________________________________________________
block17_8_ac (Activation)       (None, 17, 17, 1088) 0           block17_8[0][0]                  
__________________________________________________________________________________________________
conv2d_110 (Conv2D)             (None, 17, 17, 128)  139264      block17_8_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_110 (BatchN (None, 17, 17, 128)  384         conv2d_110[0][0]                 
__________________________________________________________________________________________________
activation_110 (Activation)     (None, 17, 17, 128)  0           batch_normalization_110[0][0]    
__________________________________________________________________________________________________
conv2d_111 (Conv2D)             (None, 17, 17, 160)  143360      activation_110[0][0]             
__________________________________________________________________________________________________
batch_normalization_111 (BatchN (None, 17, 17, 160)  480         conv2d_111[0][0]                 
__________________________________________________________________________________________________
activation_111 (Activation)     (None, 17, 17, 160)  0           batch_normalization_111[0][0]    
__________________________________________________________________________________________________
conv2d_109 (Conv2D)             (None, 17, 17, 192)  208896      block17_8_ac[0][0]               
__________________________________________________________________________________________________
conv2d_112 (Conv2D)             (None, 17, 17, 192)  215040      activation_111[0][0]             
__________________________________________________________________________________________________
batch_normalization_109 (BatchN (None, 17, 17, 192)  576         conv2d_109[0][0]                 
__________________________________________________________________________________________________
batch_normalization_112 (BatchN (None, 17, 17, 192)  576         conv2d_112[0][0]                 
__________________________________________________________________________________________________
activation_109 (Activation)     (None, 17, 17, 192)  0           batch_normalization_109[0][0]    
__________________________________________________________________________________________________
activation_112 (Activation)     (None, 17, 17, 192)  0           batch_normalization_112[0][0]    
__________________________________________________________________________________________________
block17_9_mixed (Concatenate)   (None, 17, 17, 384)  0           activation_109[0][0]             
                                                                 activation_112[0][0]             
__________________________________________________________________________________________________
block17_9_conv (Conv2D)         (None, 17, 17, 1088) 418880      block17_9_mixed[0][0]            
__________________________________________________________________________________________________
block17_9 (Lambda)              (None, 17, 17, 1088) 0           block17_8_ac[0][0]               
                                                                 block17_9_conv[0][0]             
__________________________________________________________________________________________________
block17_9_ac (Activation)       (None, 17, 17, 1088) 0           block17_9[0][0]                  
__________________________________________________________________________________________________
conv2d_114 (Conv2D)             (None, 17, 17, 128)  139264      block17_9_ac[0][0]               
__________________________________________________________________________________________________
batch_normalization_114 (BatchN (None, 17, 17, 128)  384         conv2d_114[0][0]                 
__________________________________________________________________________________________________
activation_114 (Activation)     (None, 17, 17, 128)  0           batch_normalization_114[0][0]    
__________________________________________________________________________________________________
conv2d_115 (Conv2D)             (None, 17, 17, 160)  143360      activation_114[0][0]             
__________________________________________________________________________________________________
batch_normalization_115 (BatchN (None, 17, 17, 160)  480         conv2d_115[0][0]                 
__________________________________________________________________________________________________
activation_115 (Activation)     (None, 17, 17, 160)  0           batch_normalization_115[0][0]    
__________________________________________________________________________________________________
conv2d_113 (Conv2D)             (None, 17, 17, 192)  208896      block17_9_ac[0][0]               
__________________________________________________________________________________________________
conv2d_116 (Conv2D)             (None, 17, 17, 192)  215040      activation_115[0][0]             
__________________________________________________________________________________________________
batch_normalization_113 (BatchN (None, 17, 17, 192)  576         conv2d_113[0][0]                 
__________________________________________________________________________________________________
batch_normalization_116 (BatchN (None, 17, 17, 192)  576         conv2d_116[0][0]                 
__________________________________________________________________________________________________
activation_113 (Activation)     (None, 17, 17, 192)  0           batch_normalization_113[0][0]    
__________________________________________________________________________________________________
activation_116 (Activation)     (None, 17, 17, 192)  0           batch_normalization_116[0][0]    
__________________________________________________________________________________________________
block17_10_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_113[0][0]             
                                                                 activation_116[0][0]             
__________________________________________________________________________________________________
block17_10_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_10_mixed[0][0]           
__________________________________________________________________________________________________
block17_10 (Lambda)             (None, 17, 17, 1088) 0           block17_9_ac[0][0]               
                                                                 block17_10_conv[0][0]            
__________________________________________________________________________________________________
block17_10_ac (Activation)      (None, 17, 17, 1088) 0           block17_10[0][0]                 
__________________________________________________________________________________________________
conv2d_118 (Conv2D)             (None, 17, 17, 128)  139264      block17_10_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_118 (BatchN (None, 17, 17, 128)  384         conv2d_118[0][0]                 
__________________________________________________________________________________________________
activation_118 (Activation)     (None, 17, 17, 128)  0           batch_normalization_118[0][0]    
__________________________________________________________________________________________________
conv2d_119 (Conv2D)             (None, 17, 17, 160)  143360      activation_118[0][0]             
__________________________________________________________________________________________________
batch_normalization_119 (BatchN (None, 17, 17, 160)  480         conv2d_119[0][0]                 
__________________________________________________________________________________________________
activation_119 (Activation)     (None, 17, 17, 160)  0           batch_normalization_119[0][0]    
__________________________________________________________________________________________________
conv2d_117 (Conv2D)             (None, 17, 17, 192)  208896      block17_10_ac[0][0]              
__________________________________________________________________________________________________
conv2d_120 (Conv2D)             (None, 17, 17, 192)  215040      activation_119[0][0]             
__________________________________________________________________________________________________
batch_normalization_117 (BatchN (None, 17, 17, 192)  576         conv2d_117[0][0]                 
__________________________________________________________________________________________________
batch_normalization_120 (BatchN (None, 17, 17, 192)  576         conv2d_120[0][0]                 
__________________________________________________________________________________________________
activation_117 (Activation)     (None, 17, 17, 192)  0           batch_normalization_117[0][0]    
__________________________________________________________________________________________________
activation_120 (Activation)     (None, 17, 17, 192)  0           batch_normalization_120[0][0]    
__________________________________________________________________________________________________
block17_11_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_117[0][0]             
                                                                 activation_120[0][0]             
__________________________________________________________________________________________________
block17_11_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_11_mixed[0][0]           
__________________________________________________________________________________________________
block17_11 (Lambda)             (None, 17, 17, 1088) 0           block17_10_ac[0][0]              
                                                                 block17_11_conv[0][0]            
__________________________________________________________________________________________________
block17_11_ac (Activation)      (None, 17, 17, 1088) 0           block17_11[0][0]                 
__________________________________________________________________________________________________
conv2d_122 (Conv2D)             (None, 17, 17, 128)  139264      block17_11_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_122 (BatchN (None, 17, 17, 128)  384         conv2d_122[0][0]                 
__________________________________________________________________________________________________
activation_122 (Activation)     (None, 17, 17, 128)  0           batch_normalization_122[0][0]    
__________________________________________________________________________________________________
conv2d_123 (Conv2D)             (None, 17, 17, 160)  143360      activation_122[0][0]             
__________________________________________________________________________________________________
batch_normalization_123 (BatchN (None, 17, 17, 160)  480         conv2d_123[0][0]                 
__________________________________________________________________________________________________
activation_123 (Activation)     (None, 17, 17, 160)  0           batch_normalization_123[0][0]    
__________________________________________________________________________________________________
conv2d_121 (Conv2D)             (None, 17, 17, 192)  208896      block17_11_ac[0][0]              
__________________________________________________________________________________________________
conv2d_124 (Conv2D)             (None, 17, 17, 192)  215040      activation_123[0][0]             
__________________________________________________________________________________________________
batch_normalization_121 (BatchN (None, 17, 17, 192)  576         conv2d_121[0][0]                 
__________________________________________________________________________________________________
batch_normalization_124 (BatchN (None, 17, 17, 192)  576         conv2d_124[0][0]                 
__________________________________________________________________________________________________
activation_121 (Activation)     (None, 17, 17, 192)  0           batch_normalization_121[0][0]    
__________________________________________________________________________________________________
activation_124 (Activation)     (None, 17, 17, 192)  0           batch_normalization_124[0][0]    
__________________________________________________________________________________________________
block17_12_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_121[0][0]             
                                                                 activation_124[0][0]             
__________________________________________________________________________________________________
block17_12_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_12_mixed[0][0]           
__________________________________________________________________________________________________
block17_12 (Lambda)             (None, 17, 17, 1088) 0           block17_11_ac[0][0]              
                                                                 block17_12_conv[0][0]            
__________________________________________________________________________________________________
block17_12_ac (Activation)      (None, 17, 17, 1088) 0           block17_12[0][0]                 
__________________________________________________________________________________________________
conv2d_126 (Conv2D)             (None, 17, 17, 128)  139264      block17_12_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_126 (BatchN (None, 17, 17, 128)  384         conv2d_126[0][0]                 
__________________________________________________________________________________________________
activation_126 (Activation)     (None, 17, 17, 128)  0           batch_normalization_126[0][0]    
__________________________________________________________________________________________________
conv2d_127 (Conv2D)             (None, 17, 17, 160)  143360      activation_126[0][0]             
__________________________________________________________________________________________________
batch_normalization_127 (BatchN (None, 17, 17, 160)  480         conv2d_127[0][0]                 
__________________________________________________________________________________________________
activation_127 (Activation)     (None, 17, 17, 160)  0           batch_normalization_127[0][0]    
__________________________________________________________________________________________________
conv2d_125 (Conv2D)             (None, 17, 17, 192)  208896      block17_12_ac[0][0]              
__________________________________________________________________________________________________
conv2d_128 (Conv2D)             (None, 17, 17, 192)  215040      activation_127[0][0]             
__________________________________________________________________________________________________
batch_normalization_125 (BatchN (None, 17, 17, 192)  576         conv2d_125[0][0]                 
__________________________________________________________________________________________________
batch_normalization_128 (BatchN (None, 17, 17, 192)  576         conv2d_128[0][0]                 
__________________________________________________________________________________________________
activation_125 (Activation)     (None, 17, 17, 192)  0           batch_normalization_125[0][0]    
__________________________________________________________________________________________________
activation_128 (Activation)     (None, 17, 17, 192)  0           batch_normalization_128[0][0]    
__________________________________________________________________________________________________
block17_13_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_125[0][0]             
                                                                 activation_128[0][0]             
__________________________________________________________________________________________________
block17_13_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_13_mixed[0][0]           
__________________________________________________________________________________________________
block17_13 (Lambda)             (None, 17, 17, 1088) 0           block17_12_ac[0][0]              
                                                                 block17_13_conv[0][0]            
__________________________________________________________________________________________________
block17_13_ac (Activation)      (None, 17, 17, 1088) 0           block17_13[0][0]                 
__________________________________________________________________________________________________
conv2d_130 (Conv2D)             (None, 17, 17, 128)  139264      block17_13_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_130 (BatchN (None, 17, 17, 128)  384         conv2d_130[0][0]                 
__________________________________________________________________________________________________
activation_130 (Activation)     (None, 17, 17, 128)  0           batch_normalization_130[0][0]    
__________________________________________________________________________________________________
conv2d_131 (Conv2D)             (None, 17, 17, 160)  143360      activation_130[0][0]             
__________________________________________________________________________________________________
batch_normalization_131 (BatchN (None, 17, 17, 160)  480         conv2d_131[0][0]                 
__________________________________________________________________________________________________
activation_131 (Activation)     (None, 17, 17, 160)  0           batch_normalization_131[0][0]    
__________________________________________________________________________________________________
conv2d_129 (Conv2D)             (None, 17, 17, 192)  208896      block17_13_ac[0][0]              
__________________________________________________________________________________________________
conv2d_132 (Conv2D)             (None, 17, 17, 192)  215040      activation_131[0][0]             
__________________________________________________________________________________________________
batch_normalization_129 (BatchN (None, 17, 17, 192)  576         conv2d_129[0][0]                 
__________________________________________________________________________________________________
batch_normalization_132 (BatchN (None, 17, 17, 192)  576         conv2d_132[0][0]                 
__________________________________________________________________________________________________
activation_129 (Activation)     (None, 17, 17, 192)  0           batch_normalization_129[0][0]    
__________________________________________________________________________________________________
activation_132 (Activation)     (None, 17, 17, 192)  0           batch_normalization_132[0][0]    
__________________________________________________________________________________________________
block17_14_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_129[0][0]             
                                                                 activation_132[0][0]             
__________________________________________________________________________________________________
block17_14_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_14_mixed[0][0]           
__________________________________________________________________________________________________
block17_14 (Lambda)             (None, 17, 17, 1088) 0           block17_13_ac[0][0]              
                                                                 block17_14_conv[0][0]            
__________________________________________________________________________________________________
block17_14_ac (Activation)      (None, 17, 17, 1088) 0           block17_14[0][0]                 
__________________________________________________________________________________________________
conv2d_134 (Conv2D)             (None, 17, 17, 128)  139264      block17_14_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_134 (BatchN (None, 17, 17, 128)  384         conv2d_134[0][0]                 
__________________________________________________________________________________________________
activation_134 (Activation)     (None, 17, 17, 128)  0           batch_normalization_134[0][0]    
__________________________________________________________________________________________________
conv2d_135 (Conv2D)             (None, 17, 17, 160)  143360      activation_134[0][0]             
__________________________________________________________________________________________________
batch_normalization_135 (BatchN (None, 17, 17, 160)  480         conv2d_135[0][0]                 
__________________________________________________________________________________________________
activation_135 (Activation)     (None, 17, 17, 160)  0           batch_normalization_135[0][0]    
__________________________________________________________________________________________________
conv2d_133 (Conv2D)             (None, 17, 17, 192)  208896      block17_14_ac[0][0]              
__________________________________________________________________________________________________
conv2d_136 (Conv2D)             (None, 17, 17, 192)  215040      activation_135[0][0]             
__________________________________________________________________________________________________
batch_normalization_133 (BatchN (None, 17, 17, 192)  576         conv2d_133[0][0]                 
__________________________________________________________________________________________________
batch_normalization_136 (BatchN (None, 17, 17, 192)  576         conv2d_136[0][0]                 
__________________________________________________________________________________________________
activation_133 (Activation)     (None, 17, 17, 192)  0           batch_normalization_133[0][0]    
__________________________________________________________________________________________________
activation_136 (Activation)     (None, 17, 17, 192)  0           batch_normalization_136[0][0]    
__________________________________________________________________________________________________
block17_15_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_133[0][0]             
                                                                 activation_136[0][0]             
__________________________________________________________________________________________________
block17_15_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_15_mixed[0][0]           
__________________________________________________________________________________________________
block17_15 (Lambda)             (None, 17, 17, 1088) 0           block17_14_ac[0][0]              
                                                                 block17_15_conv[0][0]            
__________________________________________________________________________________________________
block17_15_ac (Activation)      (None, 17, 17, 1088) 0           block17_15[0][0]                 
__________________________________________________________________________________________________
conv2d_138 (Conv2D)             (None, 17, 17, 128)  139264      block17_15_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_138 (BatchN (None, 17, 17, 128)  384         conv2d_138[0][0]                 
__________________________________________________________________________________________________
activation_138 (Activation)     (None, 17, 17, 128)  0           batch_normalization_138[0][0]    
__________________________________________________________________________________________________
conv2d_139 (Conv2D)             (None, 17, 17, 160)  143360      activation_138[0][0]             
__________________________________________________________________________________________________
batch_normalization_139 (BatchN (None, 17, 17, 160)  480         conv2d_139[0][0]                 
__________________________________________________________________________________________________
activation_139 (Activation)     (None, 17, 17, 160)  0           batch_normalization_139[0][0]    
__________________________________________________________________________________________________
conv2d_137 (Conv2D)             (None, 17, 17, 192)  208896      block17_15_ac[0][0]              
__________________________________________________________________________________________________
conv2d_140 (Conv2D)             (None, 17, 17, 192)  215040      activation_139[0][0]             
__________________________________________________________________________________________________
batch_normalization_137 (BatchN (None, 17, 17, 192)  576         conv2d_137[0][0]                 
__________________________________________________________________________________________________
batch_normalization_140 (BatchN (None, 17, 17, 192)  576         conv2d_140[0][0]                 
__________________________________________________________________________________________________
activation_137 (Activation)     (None, 17, 17, 192)  0           batch_normalization_137[0][0]    
__________________________________________________________________________________________________
activation_140 (Activation)     (None, 17, 17, 192)  0           batch_normalization_140[0][0]    
__________________________________________________________________________________________________
block17_16_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_137[0][0]             
                                                                 activation_140[0][0]             
__________________________________________________________________________________________________
block17_16_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_16_mixed[0][0]           
__________________________________________________________________________________________________
block17_16 (Lambda)             (None, 17, 17, 1088) 0           block17_15_ac[0][0]              
                                                                 block17_16_conv[0][0]            
__________________________________________________________________________________________________
block17_16_ac (Activation)      (None, 17, 17, 1088) 0           block17_16[0][0]                 
__________________________________________________________________________________________________
conv2d_142 (Conv2D)             (None, 17, 17, 128)  139264      block17_16_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_142 (BatchN (None, 17, 17, 128)  384         conv2d_142[0][0]                 
__________________________________________________________________________________________________
activation_142 (Activation)     (None, 17, 17, 128)  0           batch_normalization_142[0][0]    
__________________________________________________________________________________________________
conv2d_143 (Conv2D)             (None, 17, 17, 160)  143360      activation_142[0][0]             
__________________________________________________________________________________________________
batch_normalization_143 (BatchN (None, 17, 17, 160)  480         conv2d_143[0][0]                 
__________________________________________________________________________________________________
activation_143 (Activation)     (None, 17, 17, 160)  0           batch_normalization_143[0][0]    
__________________________________________________________________________________________________
conv2d_141 (Conv2D)             (None, 17, 17, 192)  208896      block17_16_ac[0][0]              
__________________________________________________________________________________________________
conv2d_144 (Conv2D)             (None, 17, 17, 192)  215040      activation_143[0][0]             
__________________________________________________________________________________________________
batch_normalization_141 (BatchN (None, 17, 17, 192)  576         conv2d_141[0][0]                 
__________________________________________________________________________________________________
batch_normalization_144 (BatchN (None, 17, 17, 192)  576         conv2d_144[0][0]                 
__________________________________________________________________________________________________
activation_141 (Activation)     (None, 17, 17, 192)  0           batch_normalization_141[0][0]    
__________________________________________________________________________________________________
activation_144 (Activation)     (None, 17, 17, 192)  0           batch_normalization_144[0][0]    
__________________________________________________________________________________________________
block17_17_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_141[0][0]             
                                                                 activation_144[0][0]             
__________________________________________________________________________________________________
block17_17_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_17_mixed[0][0]           
__________________________________________________________________________________________________
block17_17 (Lambda)             (None, 17, 17, 1088) 0           block17_16_ac[0][0]              
                                                                 block17_17_conv[0][0]            
__________________________________________________________________________________________________
block17_17_ac (Activation)      (None, 17, 17, 1088) 0           block17_17[0][0]                 
__________________________________________________________________________________________________
conv2d_146 (Conv2D)             (None, 17, 17, 128)  139264      block17_17_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_146 (BatchN (None, 17, 17, 128)  384         conv2d_146[0][0]                 
__________________________________________________________________________________________________
activation_146 (Activation)     (None, 17, 17, 128)  0           batch_normalization_146[0][0]    
__________________________________________________________________________________________________
conv2d_147 (Conv2D)             (None, 17, 17, 160)  143360      activation_146[0][0]             
__________________________________________________________________________________________________
batch_normalization_147 (BatchN (None, 17, 17, 160)  480         conv2d_147[0][0]                 
__________________________________________________________________________________________________
activation_147 (Activation)     (None, 17, 17, 160)  0           batch_normalization_147[0][0]    
__________________________________________________________________________________________________
conv2d_145 (Conv2D)             (None, 17, 17, 192)  208896      block17_17_ac[0][0]              
__________________________________________________________________________________________________
conv2d_148 (Conv2D)             (None, 17, 17, 192)  215040      activation_147[0][0]             
__________________________________________________________________________________________________
batch_normalization_145 (BatchN (None, 17, 17, 192)  576         conv2d_145[0][0]                 
__________________________________________________________________________________________________
batch_normalization_148 (BatchN (None, 17, 17, 192)  576         conv2d_148[0][0]                 
__________________________________________________________________________________________________
activation_145 (Activation)     (None, 17, 17, 192)  0           batch_normalization_145[0][0]    
__________________________________________________________________________________________________
activation_148 (Activation)     (None, 17, 17, 192)  0           batch_normalization_148[0][0]    
__________________________________________________________________________________________________
block17_18_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_145[0][0]             
                                                                 activation_148[0][0]             
__________________________________________________________________________________________________
block17_18_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_18_mixed[0][0]           
__________________________________________________________________________________________________
block17_18 (Lambda)             (None, 17, 17, 1088) 0           block17_17_ac[0][0]              
                                                                 block17_18_conv[0][0]            
__________________________________________________________________________________________________
block17_18_ac (Activation)      (None, 17, 17, 1088) 0           block17_18[0][0]                 
__________________________________________________________________________________________________
conv2d_150 (Conv2D)             (None, 17, 17, 128)  139264      block17_18_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_150 (BatchN (None, 17, 17, 128)  384         conv2d_150[0][0]                 
__________________________________________________________________________________________________
activation_150 (Activation)     (None, 17, 17, 128)  0           batch_normalization_150[0][0]    
__________________________________________________________________________________________________
conv2d_151 (Conv2D)             (None, 17, 17, 160)  143360      activation_150[0][0]             
__________________________________________________________________________________________________
batch_normalization_151 (BatchN (None, 17, 17, 160)  480         conv2d_151[0][0]                 
__________________________________________________________________________________________________
activation_151 (Activation)     (None, 17, 17, 160)  0           batch_normalization_151[0][0]    
__________________________________________________________________________________________________
conv2d_149 (Conv2D)             (None, 17, 17, 192)  208896      block17_18_ac[0][0]              
__________________________________________________________________________________________________
conv2d_152 (Conv2D)             (None, 17, 17, 192)  215040      activation_151[0][0]             
__________________________________________________________________________________________________
batch_normalization_149 (BatchN (None, 17, 17, 192)  576         conv2d_149[0][0]                 
__________________________________________________________________________________________________
batch_normalization_152 (BatchN (None, 17, 17, 192)  576         conv2d_152[0][0]                 
__________________________________________________________________________________________________
activation_149 (Activation)     (None, 17, 17, 192)  0           batch_normalization_149[0][0]    
__________________________________________________________________________________________________
activation_152 (Activation)     (None, 17, 17, 192)  0           batch_normalization_152[0][0]    
__________________________________________________________________________________________________
block17_19_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_149[0][0]             
                                                                 activation_152[0][0]             
__________________________________________________________________________________________________
block17_19_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_19_mixed[0][0]           
__________________________________________________________________________________________________
block17_19 (Lambda)             (None, 17, 17, 1088) 0           block17_18_ac[0][0]              
                                                                 block17_19_conv[0][0]            
__________________________________________________________________________________________________
block17_19_ac (Activation)      (None, 17, 17, 1088) 0           block17_19[0][0]                 
__________________________________________________________________________________________________
conv2d_154 (Conv2D)             (None, 17, 17, 128)  139264      block17_19_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_154 (BatchN (None, 17, 17, 128)  384         conv2d_154[0][0]                 
__________________________________________________________________________________________________
activation_154 (Activation)     (None, 17, 17, 128)  0           batch_normalization_154[0][0]    
__________________________________________________________________________________________________
conv2d_155 (Conv2D)             (None, 17, 17, 160)  143360      activation_154[0][0]             
__________________________________________________________________________________________________
batch_normalization_155 (BatchN (None, 17, 17, 160)  480         conv2d_155[0][0]                 
__________________________________________________________________________________________________
activation_155 (Activation)     (None, 17, 17, 160)  0           batch_normalization_155[0][0]    
__________________________________________________________________________________________________
conv2d_153 (Conv2D)             (None, 17, 17, 192)  208896      block17_19_ac[0][0]              
__________________________________________________________________________________________________
conv2d_156 (Conv2D)             (None, 17, 17, 192)  215040      activation_155[0][0]             
__________________________________________________________________________________________________
batch_normalization_153 (BatchN (None, 17, 17, 192)  576         conv2d_153[0][0]                 
__________________________________________________________________________________________________
batch_normalization_156 (BatchN (None, 17, 17, 192)  576         conv2d_156[0][0]                 
__________________________________________________________________________________________________
activation_153 (Activation)     (None, 17, 17, 192)  0           batch_normalization_153[0][0]    
__________________________________________________________________________________________________
activation_156 (Activation)     (None, 17, 17, 192)  0           batch_normalization_156[0][0]    
__________________________________________________________________________________________________
block17_20_mixed (Concatenate)  (None, 17, 17, 384)  0           activation_153[0][0]             
                                                                 activation_156[0][0]             
__________________________________________________________________________________________________
block17_20_conv (Conv2D)        (None, 17, 17, 1088) 418880      block17_20_mixed[0][0]           
__________________________________________________________________________________________________
block17_20 (Lambda)             (None, 17, 17, 1088) 0           block17_19_ac[0][0]              
                                                                 block17_20_conv[0][0]            
__________________________________________________________________________________________________
block17_20_ac (Activation)      (None, 17, 17, 1088) 0           block17_20[0][0]                 
__________________________________________________________________________________________________
conv2d_161 (Conv2D)             (None, 17, 17, 256)  278528      block17_20_ac[0][0]              
__________________________________________________________________________________________________
batch_normalization_161 (BatchN (None, 17, 17, 256)  768         conv2d_161[0][0]                 
__________________________________________________________________________________________________
activation_161 (Activation)     (None, 17, 17, 256)  0           batch_normalization_161[0][0]    
__________________________________________________________________________________________________
conv2d_157 (Conv2D)             (None, 17, 17, 256)  278528      block17_20_ac[0][0]              
__________________________________________________________________________________________________
conv2d_159 (Conv2D)             (None, 17, 17, 256)  278528      block17_20_ac[0][0]              
__________________________________________________________________________________________________
conv2d_162 (Conv2D)             (None, 17, 17, 288)  663552      activation_161[0][0]             
__________________________________________________________________________________________________
batch_normalization_157 (BatchN (None, 17, 17, 256)  768         conv2d_157[0][0]                 
__________________________________________________________________________________________________
batch_normalization_159 (BatchN (None, 17, 17, 256)  768         conv2d_159[0][0]                 
__________________________________________________________________________________________________
batch_normalization_162 (BatchN (None, 17, 17, 288)  864         conv2d_162[0][0]                 
__________________________________________________________________________________________________
activation_157 (Activation)     (None, 17, 17, 256)  0           batch_normalization_157[0][0]    
__________________________________________________________________________________________________
activation_159 (Activation)     (None, 17, 17, 256)  0           batch_normalization_159[0][0]    
__________________________________________________________________________________________________
activation_162 (Activation)     (None, 17, 17, 288)  0           batch_normalization_162[0][0]    
__________________________________________________________________________________________________
conv2d_158 (Conv2D)             (None, 8, 8, 384)    884736      activation_157[0][0]             
__________________________________________________________________________________________________
conv2d_160 (Conv2D)             (None, 8, 8, 288)    663552      activation_159[0][0]             
__________________________________________________________________________________________________
conv2d_163 (Conv2D)             (None, 8, 8, 320)    829440      activation_162[0][0]             
__________________________________________________________________________________________________
batch_normalization_158 (BatchN (None, 8, 8, 384)    1152        conv2d_158[0][0]                 
__________________________________________________________________________________________________
batch_normalization_160 (BatchN (None, 8, 8, 288)    864         conv2d_160[0][0]                 
__________________________________________________________________________________________________
batch_normalization_163 (BatchN (None, 8, 8, 320)    960         conv2d_163[0][0]                 
__________________________________________________________________________________________________
activation_158 (Activation)     (None, 8, 8, 384)    0           batch_normalization_158[0][0]    
__________________________________________________________________________________________________
activation_160 (Activation)     (None, 8, 8, 288)    0           batch_normalization_160[0][0]    
__________________________________________________________________________________________________
activation_163 (Activation)     (None, 8, 8, 320)    0           batch_normalization_163[0][0]    
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D)  (None, 8, 8, 1088)   0           block17_20_ac[0][0]              
__________________________________________________________________________________________________
mixed_7a (Concatenate)          (None, 8, 8, 2080)   0           activation_158[0][0]             
                                                                 activation_160[0][0]             
                                                                 activation_163[0][0]             
                                                                 max_pooling2d_4[0][0]            
__________________________________________________________________________________________________
conv2d_165 (Conv2D)             (None, 8, 8, 192)    399360      mixed_7a[0][0]                   
__________________________________________________________________________________________________
batch_normalization_165 (BatchN (None, 8, 8, 192)    576         conv2d_165[0][0]                 
__________________________________________________________________________________________________
activation_165 (Activation)     (None, 8, 8, 192)    0           batch_normalization_165[0][0]    
__________________________________________________________________________________________________
conv2d_166 (Conv2D)             (None, 8, 8, 224)    129024      activation_165[0][0]             
__________________________________________________________________________________________________
batch_normalization_166 (BatchN (None, 8, 8, 224)    672         conv2d_166[0][0]                 
__________________________________________________________________________________________________
activation_166 (Activation)     (None, 8, 8, 224)    0           batch_normalization_166[0][0]    
__________________________________________________________________________________________________
conv2d_164 (Conv2D)             (None, 8, 8, 192)    399360      mixed_7a[0][0]                   
__________________________________________________________________________________________________
conv2d_167 (Conv2D)             (None, 8, 8, 256)    172032      activation_166[0][0]             
__________________________________________________________________________________________________
batch_normalization_164 (BatchN (None, 8, 8, 192)    576         conv2d_164[0][0]                 
__________________________________________________________________________________________________
batch_normalization_167 (BatchN (None, 8, 8, 256)    768         conv2d_167[0][0]                 
__________________________________________________________________________________________________
activation_164 (Activation)     (None, 8, 8, 192)    0           batch_normalization_164[0][0]    
__________________________________________________________________________________________________
activation_167 (Activation)     (None, 8, 8, 256)    0           batch_normalization_167[0][0]    
__________________________________________________________________________________________________
block8_1_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_164[0][0]             
                                                                 activation_167[0][0]             
__________________________________________________________________________________________________
block8_1_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_1_mixed[0][0]             
__________________________________________________________________________________________________
block8_1 (Lambda)               (None, 8, 8, 2080)   0           mixed_7a[0][0]                   
                                                                 block8_1_conv[0][0]              
__________________________________________________________________________________________________
block8_1_ac (Activation)        (None, 8, 8, 2080)   0           block8_1[0][0]                   
__________________________________________________________________________________________________
conv2d_169 (Conv2D)             (None, 8, 8, 192)    399360      block8_1_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_169 (BatchN (None, 8, 8, 192)    576         conv2d_169[0][0]                 
__________________________________________________________________________________________________
activation_169 (Activation)     (None, 8, 8, 192)    0           batch_normalization_169[0][0]    
__________________________________________________________________________________________________
conv2d_170 (Conv2D)             (None, 8, 8, 224)    129024      activation_169[0][0]             
__________________________________________________________________________________________________
batch_normalization_170 (BatchN (None, 8, 8, 224)    672         conv2d_170[0][0]                 
__________________________________________________________________________________________________
activation_170 (Activation)     (None, 8, 8, 224)    0           batch_normalization_170[0][0]    
__________________________________________________________________________________________________
conv2d_168 (Conv2D)             (None, 8, 8, 192)    399360      block8_1_ac[0][0]                
__________________________________________________________________________________________________
conv2d_171 (Conv2D)             (None, 8, 8, 256)    172032      activation_170[0][0]             
__________________________________________________________________________________________________
batch_normalization_168 (BatchN (None, 8, 8, 192)    576         conv2d_168[0][0]                 
__________________________________________________________________________________________________
batch_normalization_171 (BatchN (None, 8, 8, 256)    768         conv2d_171[0][0]                 
__________________________________________________________________________________________________
activation_168 (Activation)     (None, 8, 8, 192)    0           batch_normalization_168[0][0]    
__________________________________________________________________________________________________
activation_171 (Activation)     (None, 8, 8, 256)    0           batch_normalization_171[0][0]    
__________________________________________________________________________________________________
block8_2_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_168[0][0]             
                                                                 activation_171[0][0]             
__________________________________________________________________________________________________
block8_2_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_2_mixed[0][0]             
__________________________________________________________________________________________________
block8_2 (Lambda)               (None, 8, 8, 2080)   0           block8_1_ac[0][0]                
                                                                 block8_2_conv[0][0]              
__________________________________________________________________________________________________
block8_2_ac (Activation)        (None, 8, 8, 2080)   0           block8_2[0][0]                   
__________________________________________________________________________________________________
conv2d_173 (Conv2D)             (None, 8, 8, 192)    399360      block8_2_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_173 (BatchN (None, 8, 8, 192)    576         conv2d_173[0][0]                 
__________________________________________________________________________________________________
activation_173 (Activation)     (None, 8, 8, 192)    0           batch_normalization_173[0][0]    
__________________________________________________________________________________________________
conv2d_174 (Conv2D)             (None, 8, 8, 224)    129024      activation_173[0][0]             
__________________________________________________________________________________________________
batch_normalization_174 (BatchN (None, 8, 8, 224)    672         conv2d_174[0][0]                 
__________________________________________________________________________________________________
activation_174 (Activation)     (None, 8, 8, 224)    0           batch_normalization_174[0][0]    
__________________________________________________________________________________________________
conv2d_172 (Conv2D)             (None, 8, 8, 192)    399360      block8_2_ac[0][0]                
__________________________________________________________________________________________________
conv2d_175 (Conv2D)             (None, 8, 8, 256)    172032      activation_174[0][0]             
__________________________________________________________________________________________________
batch_normalization_172 (BatchN (None, 8, 8, 192)    576         conv2d_172[0][0]                 
__________________________________________________________________________________________________
batch_normalization_175 (BatchN (None, 8, 8, 256)    768         conv2d_175[0][0]                 
__________________________________________________________________________________________________
activation_172 (Activation)     (None, 8, 8, 192)    0           batch_normalization_172[0][0]    
__________________________________________________________________________________________________
activation_175 (Activation)     (None, 8, 8, 256)    0           batch_normalization_175[0][0]    
__________________________________________________________________________________________________
block8_3_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_172[0][0]             
                                                                 activation_175[0][0]             
__________________________________________________________________________________________________
block8_3_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_3_mixed[0][0]             
__________________________________________________________________________________________________
block8_3 (Lambda)               (None, 8, 8, 2080)   0           block8_2_ac[0][0]                
                                                                 block8_3_conv[0][0]              
__________________________________________________________________________________________________
block8_3_ac (Activation)        (None, 8, 8, 2080)   0           block8_3[0][0]                   
__________________________________________________________________________________________________
conv2d_177 (Conv2D)             (None, 8, 8, 192)    399360      block8_3_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_177 (BatchN (None, 8, 8, 192)    576         conv2d_177[0][0]                 
__________________________________________________________________________________________________
activation_177 (Activation)     (None, 8, 8, 192)    0           batch_normalization_177[0][0]    
__________________________________________________________________________________________________
conv2d_178 (Conv2D)             (None, 8, 8, 224)    129024      activation_177[0][0]             
__________________________________________________________________________________________________
batch_normalization_178 (BatchN (None, 8, 8, 224)    672         conv2d_178[0][0]                 
__________________________________________________________________________________________________
activation_178 (Activation)     (None, 8, 8, 224)    0           batch_normalization_178[0][0]    
__________________________________________________________________________________________________
conv2d_176 (Conv2D)             (None, 8, 8, 192)    399360      block8_3_ac[0][0]                
__________________________________________________________________________________________________
conv2d_179 (Conv2D)             (None, 8, 8, 256)    172032      activation_178[0][0]             
__________________________________________________________________________________________________
batch_normalization_176 (BatchN (None, 8, 8, 192)    576         conv2d_176[0][0]                 
__________________________________________________________________________________________________
batch_normalization_179 (BatchN (None, 8, 8, 256)    768         conv2d_179[0][0]                 
__________________________________________________________________________________________________
activation_176 (Activation)     (None, 8, 8, 192)    0           batch_normalization_176[0][0]    
__________________________________________________________________________________________________
activation_179 (Activation)     (None, 8, 8, 256)    0           batch_normalization_179[0][0]    
__________________________________________________________________________________________________
block8_4_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_176[0][0]             
                                                                 activation_179[0][0]             
__________________________________________________________________________________________________
block8_4_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_4_mixed[0][0]             
__________________________________________________________________________________________________
block8_4 (Lambda)               (None, 8, 8, 2080)   0           block8_3_ac[0][0]                
                                                                 block8_4_conv[0][0]              
__________________________________________________________________________________________________
block8_4_ac (Activation)        (None, 8, 8, 2080)   0           block8_4[0][0]                   
__________________________________________________________________________________________________
conv2d_181 (Conv2D)             (None, 8, 8, 192)    399360      block8_4_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_181 (BatchN (None, 8, 8, 192)    576         conv2d_181[0][0]                 
__________________________________________________________________________________________________
activation_181 (Activation)     (None, 8, 8, 192)    0           batch_normalization_181[0][0]    
__________________________________________________________________________________________________
conv2d_182 (Conv2D)             (None, 8, 8, 224)    129024      activation_181[0][0]             
__________________________________________________________________________________________________
batch_normalization_182 (BatchN (None, 8, 8, 224)    672         conv2d_182[0][0]                 
__________________________________________________________________________________________________
activation_182 (Activation)     (None, 8, 8, 224)    0           batch_normalization_182[0][0]    
__________________________________________________________________________________________________
conv2d_180 (Conv2D)             (None, 8, 8, 192)    399360      block8_4_ac[0][0]                
__________________________________________________________________________________________________
conv2d_183 (Conv2D)             (None, 8, 8, 256)    172032      activation_182[0][0]             
__________________________________________________________________________________________________
batch_normalization_180 (BatchN (None, 8, 8, 192)    576         conv2d_180[0][0]                 
__________________________________________________________________________________________________
batch_normalization_183 (BatchN (None, 8, 8, 256)    768         conv2d_183[0][0]                 
__________________________________________________________________________________________________
activation_180 (Activation)     (None, 8, 8, 192)    0           batch_normalization_180[0][0]    
__________________________________________________________________________________________________
activation_183 (Activation)     (None, 8, 8, 256)    0           batch_normalization_183[0][0]    
__________________________________________________________________________________________________
block8_5_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_180[0][0]             
                                                                 activation_183[0][0]             
__________________________________________________________________________________________________
block8_5_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_5_mixed[0][0]             
__________________________________________________________________________________________________
block8_5 (Lambda)               (None, 8, 8, 2080)   0           block8_4_ac[0][0]                
                                                                 block8_5_conv[0][0]              
__________________________________________________________________________________________________
block8_5_ac (Activation)        (None, 8, 8, 2080)   0           block8_5[0][0]                   
__________________________________________________________________________________________________
conv2d_185 (Conv2D)             (None, 8, 8, 192)    399360      block8_5_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_185 (BatchN (None, 8, 8, 192)    576         conv2d_185[0][0]                 
__________________________________________________________________________________________________
activation_185 (Activation)     (None, 8, 8, 192)    0           batch_normalization_185[0][0]    
__________________________________________________________________________________________________
conv2d_186 (Conv2D)             (None, 8, 8, 224)    129024      activation_185[0][0]             
__________________________________________________________________________________________________
batch_normalization_186 (BatchN (None, 8, 8, 224)    672         conv2d_186[0][0]                 
__________________________________________________________________________________________________
activation_186 (Activation)     (None, 8, 8, 224)    0           batch_normalization_186[0][0]    
__________________________________________________________________________________________________
conv2d_184 (Conv2D)             (None, 8, 8, 192)    399360      block8_5_ac[0][0]                
__________________________________________________________________________________________________
conv2d_187 (Conv2D)             (None, 8, 8, 256)    172032      activation_186[0][0]             
__________________________________________________________________________________________________
batch_normalization_184 (BatchN (None, 8, 8, 192)    576         conv2d_184[0][0]                 
__________________________________________________________________________________________________
batch_normalization_187 (BatchN (None, 8, 8, 256)    768         conv2d_187[0][0]                 
__________________________________________________________________________________________________
activation_184 (Activation)     (None, 8, 8, 192)    0           batch_normalization_184[0][0]    
__________________________________________________________________________________________________
activation_187 (Activation)     (None, 8, 8, 256)    0           batch_normalization_187[0][0]    
__________________________________________________________________________________________________
block8_6_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_184[0][0]             
                                                                 activation_187[0][0]             
__________________________________________________________________________________________________
block8_6_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_6_mixed[0][0]             
__________________________________________________________________________________________________
block8_6 (Lambda)               (None, 8, 8, 2080)   0           block8_5_ac[0][0]                
                                                                 block8_6_conv[0][0]              
__________________________________________________________________________________________________
block8_6_ac (Activation)        (None, 8, 8, 2080)   0           block8_6[0][0]                   
__________________________________________________________________________________________________
conv2d_189 (Conv2D)             (None, 8, 8, 192)    399360      block8_6_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_189 (BatchN (None, 8, 8, 192)    576         conv2d_189[0][0]                 
__________________________________________________________________________________________________
activation_189 (Activation)     (None, 8, 8, 192)    0           batch_normalization_189[0][0]    
__________________________________________________________________________________________________
conv2d_190 (Conv2D)             (None, 8, 8, 224)    129024      activation_189[0][0]             
__________________________________________________________________________________________________
batch_normalization_190 (BatchN (None, 8, 8, 224)    672         conv2d_190[0][0]                 
__________________________________________________________________________________________________
activation_190 (Activation)     (None, 8, 8, 224)    0           batch_normalization_190[0][0]    
__________________________________________________________________________________________________
conv2d_188 (Conv2D)             (None, 8, 8, 192)    399360      block8_6_ac[0][0]                
__________________________________________________________________________________________________
conv2d_191 (Conv2D)             (None, 8, 8, 256)    172032      activation_190[0][0]             
__________________________________________________________________________________________________
batch_normalization_188 (BatchN (None, 8, 8, 192)    576         conv2d_188[0][0]                 
__________________________________________________________________________________________________
batch_normalization_191 (BatchN (None, 8, 8, 256)    768         conv2d_191[0][0]                 
__________________________________________________________________________________________________
activation_188 (Activation)     (None, 8, 8, 192)    0           batch_normalization_188[0][0]    
__________________________________________________________________________________________________
activation_191 (Activation)     (None, 8, 8, 256)    0           batch_normalization_191[0][0]    
__________________________________________________________________________________________________
block8_7_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_188[0][0]             
                                                                 activation_191[0][0]             
__________________________________________________________________________________________________
block8_7_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_7_mixed[0][0]             
__________________________________________________________________________________________________
block8_7 (Lambda)               (None, 8, 8, 2080)   0           block8_6_ac[0][0]                
                                                                 block8_7_conv[0][0]              
__________________________________________________________________________________________________
block8_7_ac (Activation)        (None, 8, 8, 2080)   0           block8_7[0][0]                   
__________________________________________________________________________________________________
conv2d_193 (Conv2D)             (None, 8, 8, 192)    399360      block8_7_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_193 (BatchN (None, 8, 8, 192)    576         conv2d_193[0][0]                 
__________________________________________________________________________________________________
activation_193 (Activation)     (None, 8, 8, 192)    0           batch_normalization_193[0][0]    
__________________________________________________________________________________________________
conv2d_194 (Conv2D)             (None, 8, 8, 224)    129024      activation_193[0][0]             
__________________________________________________________________________________________________
batch_normalization_194 (BatchN (None, 8, 8, 224)    672         conv2d_194[0][0]                 
__________________________________________________________________________________________________
activation_194 (Activation)     (None, 8, 8, 224)    0           batch_normalization_194[0][0]    
__________________________________________________________________________________________________
conv2d_192 (Conv2D)             (None, 8, 8, 192)    399360      block8_7_ac[0][0]                
__________________________________________________________________________________________________
conv2d_195 (Conv2D)             (None, 8, 8, 256)    172032      activation_194[0][0]             
__________________________________________________________________________________________________
batch_normalization_192 (BatchN (None, 8, 8, 192)    576         conv2d_192[0][0]                 
__________________________________________________________________________________________________
batch_normalization_195 (BatchN (None, 8, 8, 256)    768         conv2d_195[0][0]                 
__________________________________________________________________________________________________
activation_192 (Activation)     (None, 8, 8, 192)    0           batch_normalization_192[0][0]    
__________________________________________________________________________________________________
activation_195 (Activation)     (None, 8, 8, 256)    0           batch_normalization_195[0][0]    
__________________________________________________________________________________________________
block8_8_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_192[0][0]             
                                                                 activation_195[0][0]             
__________________________________________________________________________________________________
block8_8_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_8_mixed[0][0]             
__________________________________________________________________________________________________
block8_8 (Lambda)               (None, 8, 8, 2080)   0           block8_7_ac[0][0]                
                                                                 block8_8_conv[0][0]              
__________________________________________________________________________________________________
block8_8_ac (Activation)        (None, 8, 8, 2080)   0           block8_8[0][0]                   
__________________________________________________________________________________________________
conv2d_197 (Conv2D)             (None, 8, 8, 192)    399360      block8_8_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_197 (BatchN (None, 8, 8, 192)    576         conv2d_197[0][0]                 
__________________________________________________________________________________________________
activation_197 (Activation)     (None, 8, 8, 192)    0           batch_normalization_197[0][0]    
__________________________________________________________________________________________________
conv2d_198 (Conv2D)             (None, 8, 8, 224)    129024      activation_197[0][0]             
__________________________________________________________________________________________________
batch_normalization_198 (BatchN (None, 8, 8, 224)    672         conv2d_198[0][0]                 
__________________________________________________________________________________________________
activation_198 (Activation)     (None, 8, 8, 224)    0           batch_normalization_198[0][0]    
__________________________________________________________________________________________________
conv2d_196 (Conv2D)             (None, 8, 8, 192)    399360      block8_8_ac[0][0]                
__________________________________________________________________________________________________
conv2d_199 (Conv2D)             (None, 8, 8, 256)    172032      activation_198[0][0]             
__________________________________________________________________________________________________
batch_normalization_196 (BatchN (None, 8, 8, 192)    576         conv2d_196[0][0]                 
__________________________________________________________________________________________________
batch_normalization_199 (BatchN (None, 8, 8, 256)    768         conv2d_199[0][0]                 
__________________________________________________________________________________________________
activation_196 (Activation)     (None, 8, 8, 192)    0           batch_normalization_196[0][0]    
__________________________________________________________________________________________________
activation_199 (Activation)     (None, 8, 8, 256)    0           batch_normalization_199[0][0]    
__________________________________________________________________________________________________
block8_9_mixed (Concatenate)    (None, 8, 8, 448)    0           activation_196[0][0]             
                                                                 activation_199[0][0]             
__________________________________________________________________________________________________
block8_9_conv (Conv2D)          (None, 8, 8, 2080)   933920      block8_9_mixed[0][0]             
__________________________________________________________________________________________________
block8_9 (Lambda)               (None, 8, 8, 2080)   0           block8_8_ac[0][0]                
                                                                 block8_9_conv[0][0]              
__________________________________________________________________________________________________
block8_9_ac (Activation)        (None, 8, 8, 2080)   0           block8_9[0][0]                   
__________________________________________________________________________________________________
conv2d_201 (Conv2D)             (None, 8, 8, 192)    399360      block8_9_ac[0][0]                
__________________________________________________________________________________________________
batch_normalization_201 (BatchN (None, 8, 8, 192)    576         conv2d_201[0][0]                 
__________________________________________________________________________________________________
activation_201 (Activation)     (None, 8, 8, 192)    0           batch_normalization_201[0][0]    
__________________________________________________________________________________________________
conv2d_202 (Conv2D)             (None, 8, 8, 224)    129024      activation_201[0][0]             
__________________________________________________________________________________________________
batch_normalization_202 (BatchN (None, 8, 8, 224)    672         conv2d_202[0][0]                 
__________________________________________________________________________________________________
activation_202 (Activation)     (None, 8, 8, 224)    0           batch_normalization_202[0][0]    
__________________________________________________________________________________________________
conv2d_200 (Conv2D)             (None, 8, 8, 192)    399360      block8_9_ac[0][0]                
__________________________________________________________________________________________________
conv2d_203 (Conv2D)             (None, 8, 8, 256)    172032      activation_202[0][0]             
__________________________________________________________________________________________________
batch_normalization_200 (BatchN (None, 8, 8, 192)    576         conv2d_200[0][0]                 
__________________________________________________________________________________________________
batch_normalization_203 (BatchN (None, 8, 8, 256)    768         conv2d_203[0][0]                 
__________________________________________________________________________________________________
activation_200 (Activation)     (None, 8, 8, 192)    0           batch_normalization_200[0][0]    
__________________________________________________________________________________________________
activation_203 (Activation)     (None, 8, 8, 256)    0           batch_normalization_203[0][0]    
__________________________________________________________________________________________________
block8_10_mixed (Concatenate)   (None, 8, 8, 448)    0           activation_200[0][0]             
                                                                 activation_203[0][0]             
__________________________________________________________________________________________________
block8_10_conv (Conv2D)         (None, 8, 8, 2080)   933920      block8_10_mixed[0][0]            
__________________________________________________________________________________________________
block8_10 (Lambda)              (None, 8, 8, 2080)   0           block8_9_ac[0][0]                
                                                                 block8_10_conv[0][0]             
__________________________________________________________________________________________________
conv_7b (Conv2D)                (None, 8, 8, 1536)   3194880     block8_10[0][0]                  
__________________________________________________________________________________________________
conv_7b_bn (BatchNormalization) (None, 8, 8, 1536)   4608        conv_7b[0][0]                    
__________________________________________________________________________________________________
conv_7b_ac (Activation)         (None, 8, 8, 1536)   0           conv_7b_bn[0][0]                 
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2 (None, 1536)         0           conv_7b_ac[0][0]                 
__________________________________________________________________________________________________
predictions (Dense)             (None, 1000)         1537000     avg_pool[0][0]                   
==================================================================================================
Total params: 55,873,736
Trainable params: 55,813,192
Non-trainable params: 60,544
__________________________________________________________________________________________________
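
The layers that matter most for a class-activation heatmap are visible at the bottom of this summary: the last convolutional feature map conv_7b_ac (8 x 8 x 1536), the global average pooling layer avg_pool, and the 1,000-class predictions layer. A minimal sketch of how these layers could be looked up by name on the model_InceptionResNetV2 object used below:

# Sketch: inspect the layers relevant for a class-activation heatmap,
# using the layer names shown in the summary above.
last_conv_layer = model_InceptionResNetV2.get_layer('conv_7b_ac')   # last convolutional feature map
pool_layer      = model_InceptionResNetV2.get_layer('avg_pool')     # global average pooling
output_layer    = model_InceptionResNetV2.get_layer('predictions')  # final 1,000-class prediction layer
print(last_conv_layer.output_shape)   # (None, 8, 8, 1536)
print(output_layer.output_shape)      # (None, 1000)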

Generating a visual summary of the InceptionResNetV2 model


In [45]:
! apt-get install -y graphviz libgraphviz-dev && pip3 install pydot graphviz


Reading package lists... Done
Building dependency tree       
Reading state information... Done
graphviz is already the newest version (2.40.1-2).
libgraphviz-dev is already the newest version (2.40.1-2).
0 upgraded, 0 newly installed, 0 to remove and 8 not upgraded.
Requirement already satisfied: pydot in /usr/local/lib/python3.6/dist-packages (1.3.0)
Requirement already satisfied: graphviz in /usr/local/lib/python3.6/dist-packages (0.10.1)
Requirement already satisfied: pyparsing>=2.1.4 in /usr/local/lib/python3.6/dist-packages (from pydot) (2.3.1)
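
Since plot_model and model_to_dot rely on the system Graphviz binaries installed above, a quick optional sanity check (a sketch, not part of the plotting cells) is to render a trivial graph with pydot; an error here usually means the dot executable is not on the PATH:

import pydot
# Render an empty directed graph; this fails if the Graphviz 'dot' binary is missing.
pydot.Dot(graph_type='digraph').create(prog='dot', format='svg')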

In [0]:
from keras.utils import plot_model 
import pydot 
import graphviz # apt-get install -y graphviz libgraphviz-dev && pip3 install pydot graphviz 
from IPython.display import SVG 
from keras.utils.vis_utils import model_to_dot

In [47]:
output_dir = './'
# Write a static PNG of the model graph to disk
plot_model(model_InceptionResNetV2, to_file=output_dir + '/model_summary_plot.png')
# Render the same graph inline as an SVG
SVG(model_to_dot(model_InceptionResNetV2).create(prog='dot', format='svg'))


Out[47]:
[Inline SVG output: a graph of the full InceptionResNetV2 architecture, from input_1 through the initial Conv2D/BatchNormalization/Activation stem and max-pooling layers, the mixed_5b block and block35_1 through block35_10, mixed_6a and block17_1 through block17_20, mixed_7a and block8_1 through block8_10, ending in conv_7b, avg_pool and the predictions layer.]
140610292702400->140610292107752 140610292168968 block8_5_ac: Activation 140610292107752->140610292168968 140610291301512 conv2d_185: Conv2D 140610292168968->140610291301512 140610292169752 conv2d_184: Conv2D 140610292168968->140610292169752 140610287341240 block8_6: Lambda 140610292168968->140610287341240 140610291158096 batch_normalization_185: BatchNormalization 140610291301512->140610291158096 140610290351800 activation_185: Activation 140610291158096->140610290351800 140610290101608 conv2d_186: Conv2D 140610290351800->140610290101608 140610289935976 batch_normalization_186: BatchNormalization 140610290101608->140610289935976 140610289803048 activation_186: Activation 140610289935976->140610289803048 140610289002200 conv2d_187: Conv2D 140610289803048->140610289002200 140610291362392 batch_normalization_184: BatchNormalization 140610292169752->140610291362392 140610289112512 batch_normalization_187: BatchNormalization 140610289002200->140610289112512 140610291470176 activation_184: Activation 140610291362392->140610291470176 140610288840888 activation_187: Activation 140610289112512->140610288840888 140610287709320 block8_6_mixed: Concatenate 140610291470176->140610287709320 140610288840888->140610287709320 140610288437680 block8_6_conv: Conv2D 140610287709320->140610288437680 140610288437680->140610287341240 140610287871032 block8_6_ac: Activation 140610287341240->140610287871032 140610286658672 conv2d_189: Conv2D 140610287871032->140610286658672 140610287872768 conv2d_188: Conv2D 140610287871032->140610287872768 140610282888328 block8_7: Lambda 140610287871032->140610282888328 140610286593528 batch_normalization_189: BatchNormalization 140610286658672->140610286593528 140610285890696 activation_189: Activation 140610286593528->140610285890696 140610285369776 conv2d_190: Conv2D 140610285890696->140610285369776 140610285789024 batch_normalization_190: BatchNormalization 140610285369776->140610285789024 140610285520264 activation_190: Activation 140610285789024->140610285520264 140610284599112 conv2d_191: Conv2D 140610285520264->140610284599112 140610287083360 batch_normalization_188: BatchNormalization 140610287872768->140610287083360 140610284416360 batch_normalization_191: BatchNormalization 140610284599112->140610284416360 140610287081512 activation_188: Activation 140610287083360->140610287081512 140610284036672 activation_191: Activation 140610284416360->140610284036672 140610284003224 block8_7_mixed: Concatenate 140610287081512->140610284003224 140610284036672->140610284003224 140610283641712 block8_7_conv: Conv2D 140610284003224->140610283641712 140610283641712->140610282888328 140610282644368 block8_7_ac: Activation 140610282888328->140610282644368 140610282049320 conv2d_193: Conv2D 140610282644368->140610282049320 140610283445160 conv2d_192: Conv2D 140610282644368->140610283445160 140610278058640 block8_8: Lambda 140610282644368->140610278058640 140610281942768 batch_normalization_193: BatchNormalization 140610282049320->140610281942768 140610281589392 activation_193: Activation 140610281942768->140610281589392 140610281268952 conv2d_194: Conv2D 140610281589392->140610281268952 140610281105896 batch_normalization_194: BatchNormalization 140610281268952->140610281105896 140610280712064 activation_194: Activation 140610281105896->140610280712064 140610280302688 conv2d_195: Conv2D 140610280712064->140610280302688 140610282248400 batch_normalization_192: BatchNormalization 140610283445160->140610282248400 140610279748664 batch_normalization_195: BatchNormalization 
140610280302688->140610279748664 140610282370160 activation_192: Activation 140610282248400->140610282370160 140610279619832 activation_195: Activation 140610279748664->140610279619832 140610279022376 block8_8_mixed: Concatenate 140610282370160->140610279022376 140610279619832->140610279022376 140610278806696 block8_8_conv: Conv2D 140610279022376->140610278806696 140610278806696->140610278058640 140610278651552 block8_8_ac: Activation 140610278058640->140610278651552 140610277938960 conv2d_197: Conv2D 140610278651552->140610277938960 140610278653568 conv2d_196: Conv2D 140610278651552->140610278653568 140610274225736 block8_9: Lambda 140610278651552->140610274225736 140610277226872 batch_normalization_197: BatchNormalization 140610277938960->140610277226872 140610277174128 activation_197: Activation 140610277226872->140610277174128 140610276444088 conv2d_198: Conv2D 140610277174128->140610276444088 140610275910264 batch_normalization_198: BatchNormalization 140610276444088->140610275910264 140610276297528 activation_198: Activation 140610275910264->140610276297528 140610275508968 conv2d_199: Conv2D 140610276297528->140610275508968 140610277855128 batch_normalization_196: BatchNormalization 140610278653568->140610277855128 140610275601712 batch_normalization_199: BatchNormalization 140610275508968->140610275601712 140610277853560 activation_196: Activation 140610277855128->140610277853560 140610275321392 activation_199: Activation 140610275601712->140610275321392 140610274401304 block8_9_mixed: Concatenate 140610277853560->140610274401304 140610275321392->140610274401304 140610274399512 block8_9_conv: Conv2D 140610274401304->140610274399512 140610274399512->140610274225736 140610273651568 block8_9_ac: Activation 140610274225736->140610273651568 140610273141200 conv2d_201: Conv2D 140610273651568->140610273141200 140610273835160 conv2d_200: Conv2D 140610273651568->140610273835160 140610268829848 block8_10: Lambda 140610273651568->140610268829848 140610272572584 batch_normalization_201: BatchNormalization 140610273141200->140610272572584 140610272385176 activation_201: Activation 140610272572584->140610272385176 140610271866776 conv2d_202: Conv2D 140610272385176->140610271866776 140610271738456 batch_normalization_202: BatchNormalization 140610271866776->140610271738456 140610270924696 activation_202: Activation 140610271738456->140610270924696 140610270670296 conv2d_203: Conv2D 140610270924696->140610270670296 140610273028976 batch_normalization_200: BatchNormalization 140610273835160->140610273028976 140610270496472 batch_normalization_203: BatchNormalization 140610270670296->140610270496472 140610273027128 activation_200: Activation 140610273028976->140610273027128 140610270374880 activation_203: Activation 140610270496472->140610270374880 140610270107464 block8_10_mixed: Concatenate 140610273027128->140610270107464 140610270374880->140610270107464 140610269977456 block8_10_conv: Conv2D 140610270107464->140610269977456 140610269977456->140610268829848 140610269410024 conv_7b: Conv2D 140610268829848->140610269410024 140610268642832 conv_7b_bn: BatchNormalization 140610269410024->140610268642832 140610268641544 conv_7b_ac: Activation 140610268642832->140610268641544 140610268730984 avg_pool: GlobalAveragePooling2D 140610268641544->140610268730984 140610268733280 predictions: Dense 140610268730984->140610268733280

In [0]:
img = image.load_img(img_path, target_size=(299, 299))

In [0]:
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)

In [0]:
from keras.applications.inception_resnet_v2 import preprocess_input as PRE_PROCESSOR
import pandas as pd
from keras.applications.inception_resnet_v2 import decode_predictions as LABEL_DECODER

In [51]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              LABEL_DECODER=LABEL_DECODER,
                              MODEL=model_InceptionResNetV2,
                              LABELS=None,
                              IM_WIDTH=299,
                              IM_HEIGHT=299,
                              EVAL_STEPS=1,
                              URL_MODE=False,
                              FILE_MODE=False,
                              HEATMAP_SHAPE=[8,8])


PREDICTION: hummingbird
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
Completed processing 1 out of 1 steps in 9.6144540309906 seconds ...
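
For reference, here is a minimal Grad-CAM-style sketch of how such an 8x8 class activation heatmap can be computed for InceptionResNetV2 with the Keras backend. This is only an illustrative sketch, not the class_activation_map() implementation used above; it assumes model_InceptionResNetV2 is already loaded and reuses the 299 x 299 batch x prepared earlier, after applying PRE_PROCESSOR to it.

# Hedged sketch: Grad-CAM-style heatmap from the last convolutional activation
# of InceptionResNetV2 ('conv_7b_ac', which is 8 x 8 x 1536 for a 299 x 299 input).
# Illustrative only -- may differ from the class_activation_map() helper above.
import numpy as np
from keras import backend as K

x_pre = PRE_PROCESSOR(x.copy())                       # scale pixels the way the model expects
preds = model_InceptionResNetV2.predict(x_pre)
class_idx = np.argmax(preds[0])                       # index of the top predicted class

class_score = model_InceptionResNetV2.output[:, class_idx]
last_conv = model_InceptionResNetV2.get_layer('conv_7b_ac').output

grads = K.gradients(class_score, last_conv)[0]        # d(class score) / d(feature maps)
pooled_grads = K.mean(grads, axis=(0, 1, 2))          # one importance weight per channel
iterate = K.function([model_InceptionResNetV2.input],
                     [pooled_grads, last_conv[0]])
pooled_grads_value, conv_value = iterate([x_pre])

for i in range(conv_value.shape[-1]):                 # weight each channel by its gradient
    conv_value[:, :, i] *= pooled_grads_value[i]

heatmap = np.mean(conv_value, axis=-1)                # 8 x 8 class activation map
heatmap = np.maximum(heatmap, 0)                      # keep positive evidence only
heatmap /= (np.max(heatmap) + K.epsilon())            # normalize to [0, 1]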

In [52]:
HEATMAP = output[0]
LABEL = output[3]

plt.matshow(HEATMAP)
plt.show()
print (LABEL)


      category  probability
0  hummingbird     0.973336
1  water_ouzel     0.000573
2      jacamar     0.000501

In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]
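
The heatmap_overlay() helper used here blends the up-scaled heatmap onto the original image. A rough OpenCV equivalent is sketched below purely for illustration; in particular, the interpretation of THRESHOLD (zeroing out heatmap values below 0.8 before blending) is an assumption, not necessarily the notebook's exact implementation.

# Hedged sketch of a heatmap overlay with OpenCV (illustrative approximation only).
# Assumes HEATMAP is normalized to [0, 1].
import cv2
import numpy as np

img_bgr = cv2.imread(INPUT_IMG_FILE)                             # original image, BGR uint8
hm = cv2.resize(HEATMAP.astype('float32'),
                (img_bgr.shape[1], img_bgr.shape[0]))            # up-sample the 8 x 8 map
hm_color = cv2.applyColorMap(np.uint8(255 * hm), cv2.COLORMAP_JET)
hm_color[hm < 0.8] = 0                                           # assumed role of THRESHOLD=0.8
overlay = cv2.addWeighted(img_bgr, 1.0, hm_color, 0.4, 0)        # superimpose heatmap on image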

In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)

img=mpimg.imread(output_file)

In [55]:
plt.imshow(img)


Out[55]:
<matplotlib.image.AxesImage at 0x7fe25e64c978>

Part 4 -- Experimenting with different types of images

Test 1 -- Banana


In [59]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banana_01.jpg -O banana_01.jpg


--2019-02-21 09:30:51--  https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banana_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 80697 (79K) [image/jpeg]
Saving to: ‘banana_01.jpg’

banana_01.jpg       100%[===================>]  78.81K  --.-KB/s    in 0.01s   

2019-02-21 09:30:51 (7.33 MB/s) - ‘banana_01.jpg’ saved [80697/80697]


In [0]:
INPUT_IMG_FILE = './banana_01.jpg'

In [61]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)


Out[61]:
<matplotlib.image.AxesImage at 0x7fe25e126320>

In [62]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              LABEL_DECODER=LABEL_DECODER,
                              MODEL=model_InceptionResNetV2,
                              LABELS=None,
                              IM_WIDTH=299,
                              IM_HEIGHT=299,
                              EVAL_STEPS=1,
                              URL_MODE=False,
                              FILE_MODE=False,
                              HEATMAP_SHAPE=[8,8])


PREDICTION: banana
Completed processing 1 out of 1 steps in 4.7812793254852295 seconds ...

In [63]:
HEATMAP = output[0]
LABEL = output[3]

plt.matshow(HEATMAP)
plt.show()
print (LABEL)


        category  probability
0         banana     0.868550
1         orange     0.006009
2  grocery_store     0.001587

In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]

In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)

img=mpimg.imread(output_file)

In [66]:
plt.imshow(img)


Out[66]:
<matplotlib.image.AxesImage at 0x7fe25df8cb38>

In [67]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=LABEL,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')


/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
  stat_data = remove_na(group_data)
Out[67]:
Text(0.5, 1.0, 'Top 3 Predictions:')

Test 2 -- Banjo player


In [68]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_01.jpg -O banjo_player_01.jpg


--2019-02-21 09:31:30--  https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 7797 (7.6K) [image/jpeg]
Saving to: ‘banjo_player_01.jpg’

banjo_player_01.jpg 100%[===================>]   7.61K  --.-KB/s    in 0s      

2019-02-21 09:31:30 (79.7 MB/s) - ‘banjo_player_01.jpg’ saved [7797/7797]


In [0]:
INPUT_IMG_FILE = './banjo_player_01.jpg'

In [70]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)


Out[70]:
<matplotlib.image.AxesImage at 0x7fe25def0b00>

In [71]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              LABEL_DECODER=LABEL_DECODER,
                              MODEL=model_InceptionResNetV2,
                              LABELS=None,
                              IM_WIDTH=299,
                              IM_HEIGHT=299,
                              EVAL_STEPS=1,
                              URL_MODE=False,
                              FILE_MODE=False,
                              HEATMAP_SHAPE=[8,8])


PREDICTION: banjo
Completed processing 1 out of 1 steps in 4.5725181102752686 seconds ...

In [72]:
HEATMAP = output[0]
LABEL = output[3]

plt.matshow(HEATMAP)
plt.show()
print (LABEL)


      category  probability
0        banjo     0.933209
1  toilet_seat     0.000398
2         drum     0.000398

In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]

In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)

img=mpimg.imread(output_file)

In [75]:
plt.imshow(img)


Out[75]:
<matplotlib.image.AxesImage at 0x7fe25dd77860>

In [76]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=LABEL,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')


/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
  stat_data = remove_na(group_data)
Out[76]:
Text(0.5, 1.0, 'Top 3 Predictions:')

Test 3 -- Yet another banjo player


In [77]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_02.jpg -O banjo_player_02.jpg


--2019-02-21 09:32:04--  https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/banjo_player_02.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1073613 (1.0M) [image/jpeg]
Saving to: ‘banjo_player_02.jpg’

banjo_player_02.jpg 100%[===================>]   1.02M  --.-KB/s    in 0.03s   

2019-02-21 09:32:04 (29.8 MB/s) - ‘banjo_player_02.jpg’ saved [1073613/1073613]


In [0]:
INPUT_IMG_FILE = './banjo_player_02.jpg'

In [79]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)


Out[79]:
<matplotlib.image.AxesImage at 0x7fe25dca3eb8>

In [80]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              LABEL_DECODER=LABEL_DECODER,
                              MODEL=model_InceptionResNetV2,
                              LABELS=None,
                              IM_WIDTH=299,
                              IM_HEIGHT=299,
                              EVAL_STEPS=1,
                              URL_MODE=False,
                              FILE_MODE=False,
                              HEATMAP_SHAPE=[8,8])


PREDICTION: banjo
Completed processing 1 out of 1 steps in 4.621196985244751 seconds ...

In [81]:
HEATMAP = output[0]
LABEL = output[3]

plt.matshow(HEATMAP)
plt.show()
print (LABEL)


      category  probability
0        banjo     0.956864
1  toilet_seat     0.000349
2     strainer     0.000300

In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]

In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)

img=mpimg.imread(output_file)

In [84]:
plt.imshow(img)


Out[84]:
<matplotlib.image.AxesImage at 0x7fe25dae4cf8>

In [85]:
import seaborn as sns
f = sns.barplot(x='probability',y='category',data=LABEL,color="red")
sns.set_style(style='white')
f.grid(False)
f.spines["top"].set_visible(False)
f.spines["right"].set_visible(False)
f.spines["bottom"].set_visible(False)
f.spines["left"].set_visible(False)
f.set_title('Top 3 Predictions:')


/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.
  stat_data = remove_na(group_data)
Out[85]:
Text(0.5, 1.0, 'Top 3 Predictions:')

Test 4 -- Throne


In [86]:
! wget https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/throne_01.jpg -O throne_01.jpg


--2019-02-21 09:32:38--  https://raw.githubusercontent.com/rahulremanan/python_tutorial/master/Machine_Vision/02_Object_Prediction/test_images/throne_01.jpg
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 768540 (751K) [image/jpeg]
Saving to: ‘throne_01.jpg’

throne_01.jpg       100%[===================>] 750.53K  --.-KB/s    in 0.03s   

2019-02-21 09:32:38 (26.0 MB/s) - ‘throne_01.jpg’ saved [768540/768540]


In [0]:
INPUT_IMG_FILE = './throne_01.jpg'

In [88]:
img=mpimg.imread(INPUT_IMG_FILE)
plt.imshow(img)


Out[88]:
<matplotlib.image.AxesImage at 0x7fe25da19390>

In [89]:
output = class_activation_map(INPUT_IMG_FILE,
                              PRE_PROCESSOR=PRE_PROCESSOR,
                              LABEL_DECODER=LABEL_DECODER,
                              MODEL=model_InceptionResNetV2,
                              LABELS=None,
                              IM_WIDTH=299,
                              IM_HEIGHT=299,
                              EVAL_STEPS=1,
                              URL_MODE=False,
                              FILE_MODE=False,
                              HEATMAP_SHAPE=[8,8])


PREDICTION: throne
Completed processing 1 out of 1 steps in 4.443957090377808 seconds ...

In [90]:
HEATMAP = output[0]
LABEL = output[3]

plt.matshow(HEATMAP)
plt.show()
print (LABEL)


      category  probability
0       throne     0.901864
1  four-poster     0.014415
2        altar     0.002789

In [0]:
heatmap_output = heatmap_overlay(INPUT_IMG_FILE,
                                 HEATMAP,
                                 THRESHOLD=0.8)
superimposed_img = heatmap_output[0]

In [0]:
output_file = './class_activation_map_InceptionResNetV2.jpeg'
cv2.imwrite(output_file, superimposed_img)

img=mpimg.imread(output_file)

In [93]:
plt.imshow(img)


Out[93]:
<matplotlib.image.AxesImage at 0x7fe25d8a0e48>