In [1]:
import graphlab as gl
from IPython.display import Image
## Set a variable for the path to your (shared) data and/or model directory
path_to_dir = '../'

Building an Image Similarity Service


Who doesn't love dresses? Dresses can be incredibly diverse in terms of look, fit, feel, material, trendiness, and quality. They help us feel good, look good, and express ourselves. Needless to say, buying and selling dresses is a big deal. Depending on who you are, what you need, what you want, and what you can afford, this can be really hard. Sometimes it's just hard because you like everything and need to make a decision.

In the tutorial demo, we saw an end-to-end application that finds similar items based on text and image metadata. Here we focus on one of that application's core components, image similarity, which uses transfer learning to extract features from images and nearest neighbors to compare them and determine how similar they are.
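To make the pipeline concrete: each image is run through the pre-trained network, an intermediate layer's activations become its feature vector, and similar items are found by nearest-neighbor search over those vectors. Here is a minimal sketch of the nearest-neighbor stage in plain NumPy (toy random vectors standing in for real extracted features; this is illustrative, not the GraphLab API):

```python
import numpy as np

def nearest_neighbors(features, query_idx, k=3):
    """Return indices of the k rows of `features` closest (Euclidean)
    to the row at `query_idx`, excluding the query itself."""
    dists = np.linalg.norm(features - features[query_idx], axis=1)
    order = np.argsort(dists)
    return [i for i in order if i != query_idx][:k]

# Toy 4096-d feature vectors for 5 "images" (stand-ins for extract_features output)
rng = np.random.RandomState(0)
feats = rng.rand(5, 4096)
feats[3] = feats[0] + 0.01 * rng.rand(4096)  # make item 3 nearly identical to item 0

print(nearest_neighbors(feats, 0, k=2))  # item 3 should rank first
```

The rest of the notebook does exactly this, but with real deep features and GraphLab's scalable k-nearest neighbors model.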

This IPython notebook consists of the following parts:

  1. Loading the Data - loading existing data into an SFrame.
  2. Extracting the Features - loading an existing pre-trained model into a model object and using it to extract features.
  3. Calculating the Distance - creating and training a k-nearest neighbors model on the extracted features.
  4. Finding Similar Items - using the k-nearest neighbors model to find similar items.
  5. Saving the New Model - saving the model for deployment as a predictive service (covered in a separate notebook).


Step 1: Load the Data

In [2]:
## Create the SFrame from the path to the directory where it is saved.
image_sf = gl.SFrame(path_to_dir + 
                     'sf_processed.sframe'
                    )

image_sf
image_sf.show()  # Explore the data using the Canvas visual explorer


[INFO] This commercial license of GraphLab Create is assigned to engr@turi.com.

[INFO] Start server at: ipc:///tmp/graphlab_server-99223 - Server binary: /Users/piotrteterwak/.virtualenvs/gl-1.4/lib/python2.7/site-packages/graphlab/unity_server - Server log: /tmp/graphlab_server_1442965949.log
[INFO] GraphLab Server Version: 1.6
Canvas is accessible via web browser at the URL: http://localhost:64302/index.html
Opening Canvas in default web browser.

In [4]:
pretrained_model = gl.load_model(path_to_dir + 
                                 'pretrained_model.gl')

In [5]:
## Let's take a look at the network topology of the pretrained model
pretrained_model['network']


Out[5]:
### network layers ###
layer[0]: ConvolutionLayer
  init_random = gaussian
  padding = 0
  stride = 4
  num_channels = 96
  num_groups = 1
  kernel_size = 11
layer[1]: RectifiedLinearLayer
layer[2]: MaxPoolingLayer
  padding = 0
  stride = 2
  kernel_size = 3
layer[3]: LocalResponseNormalizationLayer
  alpha = 0.001
  beta = 0.75
  knorm = 1
  local_size = 5
layer[4]: ConvolutionLayer
  init_random = gaussian
  padding = 2
  stride = 1
  num_channels = 256
  num_groups = 2
  kernel_size = 5
layer[5]: RectifiedLinearLayer
layer[6]: MaxPoolingLayer
  padding = 0
  stride = 2
  kernel_size = 3
layer[7]: LocalResponseNormalizationLayer
  alpha = 0.001
  beta = 0.75
  knorm = 1
  local_size = 5
layer[8]: ConvolutionLayer
  init_random = gaussian
  padding = 1
  stride = 1
  num_channels = 384
  num_groups = 1
  kernel_size = 3
layer[9]: RectifiedLinearLayer
layer[10]: ConvolutionLayer
  init_random = gaussian
  padding = 1
  stride = 1
  num_channels = 384
  num_groups = 2
  kernel_size = 3
layer[11]: RectifiedLinearLayer
layer[12]: ConvolutionLayer
  init_random = gaussian
  padding = 1
  stride = 1
  init_bias = 1.0
  num_channels = 256
  num_groups = 2
  kernel_size = 3
layer[13]: RectifiedLinearLayer
layer[14]: MaxPoolingLayer
  padding = 0
  stride = 2
  kernel_size = 3
layer[15]: FlattenLayer
layer[16]: FullConnectionLayer
  init_sigma = 0.005
  init_random = gaussian
  init_bias = 1.0
  num_hidden_units = 4096
layer[17]: RectifiedLinearLayer
layer[18]: DropoutLayer
  threshold = 0.5
layer[19]: FullConnectionLayer
  init_sigma = 0.005
  init_random = gaussian
  init_bias = 1.0
  num_hidden_units = 4096
layer[20]: RectifiedLinearLayer
layer[21]: DropoutLayer
  threshold = 0.5
layer[22]: FullConnectionLayer
  init_sigma = 0.01
  init_random = gaussian
  init_bias = 0
  num_hidden_units = 1000
layer[23]: SoftmaxLayer
### end network layers ###

### network parameters ###
model_checkpoint_path = /data/imagenet/result/model_checkpoint
learning_rate = 0.01
random_type = gaussian
input_shape = 3,227,227
learning_rate_gamma = 0.1
bias_learning_rate = 0.02
model_checkpoint_interval = 5
random_mirror = 1
batch_size = 200
learning_rate_step = 100000
random_crop = 1
learning_rate_schedule = exponential_decay
l2_regularization = 0.0005
metric = accuracy,recall@5
momentum = 0.9
init_sigma = 0.01
### end network parameters ###
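The topology above is the classic AlexNet layout, and the printed parameters are enough to trace the spatial size of each feature map using the standard formula output = (input - kernel + 2*padding) // stride + 1. A quick check against the first few layers (a small illustrative script, not part of GraphLab):

```python
def out_size(inp, kernel, stride, padding=0):
    """Spatial output size of a convolution or pooling layer."""
    return (inp - kernel + 2 * padding) // stride + 1

size = 227                      # input_shape = 3,227,227
size = out_size(size, 11, 4)    # layer[0]: conv, kernel 11, stride 4 -> 55
size = out_size(size, 3, 2)     # layer[2]: max pool, kernel 3, stride 2 -> 27
size = out_size(size, 5, 1, 2)  # layer[4]: conv, kernel 5, stride 1, pad 2 -> 27
print(size)
```

Continuing down the stack the same way eventually reaches the FlattenLayer and the two 4096-unit fully-connected layers, which is where our features will come from.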

Step 2: Extract Features

We will use deep visual features to match the product images to each other. To do that, we load our pre-trained ImageNet neural network model to serve as a feature extractor, and extract features from the images in the dataset.

As an example, let's extract the features of the first image in the dataset. extract_features returns the output of the second-to-last layer of pretrained_model; stopping just before the classification layer preserves the rich features the network has learned.

We can also compare the size of the extracted features with the size of the original image:

  • 196608: byte values per 256x256-pixel RGB image (256 x 256 x 3 channels).
  • 4096: size of the penultimate fully-connected layer, and hence the number of extracted features.
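Those two numbers can be verified with simple arithmetic:

```python
# Raw image: 256 x 256 pixels, 3 byte values (R, G, B) per pixel
raw_bytes = 256 * 256 * 3
print(raw_bytes)                  # 196608

# Deep features: one value per unit in the penultimate fully-connected layer
num_features = 4096
print(raw_bytes // num_features)  # 48: roughly a 48x reduction in element count
```

So each feature vector is a much more compact representation of the image, and one that captures semantic content rather than raw pixels.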


In [6]:
image_sf['image'][:1].show()


Canvas is updated and available in a tab in the default browser.

In [7]:
# Here are the features of the first image
extracted = pretrained_model.extract_features(image_sf[['image']][:1])
# NB: image_sf is an SFrame, image_sf['image'] is an SArray, and image_sf[['image']] is an SFrame
extracted


Out[7]:
dtype: array
Rows: 1
[array('d', [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.0582170486450195, 0.0, 0.0, 0.0, 0.0, 0.0, 2.8189637660980225, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7150125503540039, 0.0, 0.0, 0.7394068241119385, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.7938714027404785, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4930804967880249, 0.0, 0.40331411361694336, 0.0, 0.0, 0.0, 0.0, 1.3235752582550049, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.355628490447998, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.43977952003479, 0.0, 1.9633976221084595, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.4535653591156006, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.6498310565948486, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.248290777206421, 0.0, 0.0, 0.0, 3.680190086364746, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.08320415019989014, 0.0, 0.0, 0.0, 0.0, 0.0, 1.9655675888061523, 0.0, 0.0, 0.0, 0.0, 2.5746583938598633, 0.0, 0.0, 0.0, 0.0, 2.1913700103759766, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.141339063644409, 0.0, 0.0, 0.0, 0.0, 0.3872112035751343, 0.7241973876953125, 0.0, 0.0, 4.62368631362915, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.45201820135116577, 0.0, 0.0, 3.18131685256958, 0.0, 0.4173318147659302, 0.0, 0.0, 0.0, 0.0, 0.8028323650360107, 2.3898766040802, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.5731427669525146, 0.0, 0.26010191440582275, 1.365661859512329, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7527472972869873, 0.0, 0.0, 0.0, 0.9688431024551392, 0.0, 0.0, 0.0, 0.0, 2.385343074798584, 0.0, 0.0, 1.175279140472412, 0.0, 0.0, 1.230550765991211, 0.0, 3.3501245975494385, 0.0, 0.3405788540840149, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.2410664558410645, 3.252714157104492, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8606007099151611, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2926876544952393, 0.6970643401145935, 1.9555680751800537, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.24955403804779053, 
0.507768988609314, 0.0, 0.0, 2.6871089935302734, 7.2013325691223145, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6849899291992188, 0.0, 0.0, 0.6870303153991699, 0.8225374817848206, 0.0, 0.0, 0.0, 2.278080463409424, 0.0, 0.0, 0.002571582794189453, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.623734712600708, 0.0, 0.0, 0.6847622394561768, 0.0, 0.0, 0.0, 0.0, 0.0, 0.09402740001678467, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.4062747955322266, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.41403669118881226, 0.8189364671707153, 0.0, 0.0, 0.0, 0.0, 1.5342092514038086, 0.0, 0.5225511789321899, 0.0, 0.32294195890426636, 0.0, 0.0, 0.0, 0.0, 2.7377049922943115, 0.0, 0.0, 1.2170634269714355, 2.938838243484497, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0959328413009644, 0.0, 0.4374886751174927, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.8955721855163574, 0.0, 0.0, 0.0, 0.9233564138412476, 0.0, 0.0, 3.141129732131958, 0.8484493494033813, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.3562090396881104, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.820507764816284, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.37714076042175293, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.21522492170333862, 0.0, 0.02302795648574829, 0.0, 0.2557913661003113, 0.0, 0.0, 0.0, 1.0163261890411377, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.992156982421875, 0.0, 0.0, 0.0, 1.78342866897583, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.1402114629745483, 0.0, 0.0, 0.0, 0.0, 0.0, 4.687260627746582, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.17816275358200073, 1.4615427255630493, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8830375075340271, 0.8379859328269958, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.546508312225342, 1.4216848611831665, 0.0, 0.0, 0.11119961738586426, 0.0, 0.1472986936569214, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.54062819480896, 0.0, 2.575019121170044, 0.0, 0.0, 0.0, 3.4327852725982666, 0.0, 0.6801859140396118, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2091611623764038, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8122560977935791, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2644112706184387, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.40397220849990845, 0.0, 3.475497245788574, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4766901731491089, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.573518991470337, 1.3157126903533936, 2.8261971473693848, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.3014185428619385, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.86806583404541, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.4893088340759277, 0.0, 3.4329781532287598, 0.0, 0.713085949420929, 0.0, 2.688925266265869, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.6634880304336548, 0.0, 0.0, 1.5468394756317139, 2.666332244873047, 0.0, 0.0, 0.0, 0.0, 0.0, 1.5402036905288696, 0.0, 0.0, 0.0, 0.0, 0.0, 0.923401415348053, 0.0, 0.4883907735347748, 0.46217483282089233, 0.0, 0.0, 2.3881633281707764, 1.180586814880371, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1603764295578003, 0.0, 0.0, 3.909684658050537, 0.0, 0.0, 0.0, 0.8867034316062927, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.8727203607559204, 1.095210075378418, 0.0, 0.0, 0.0, 0.0, 0.06960678100585938, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7240211963653564, 0.0, 0.18278425931930542, 3.991037130355835, 0.0, 0.0, 0.0, 1.859442949295044, 0.0, 0.9682266116142273, 0.0, 0.0, 0.08038914203643799, 0.0, 0.0, 0.0, 0.0, 0.0, 0.19108694791793823, 0.0, 3.5663318634033203, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.007696926593780518, 0.0, 0.0, 0.0, 0.0, 0.06517273187637329, 0.0, 0.0, 0.0, 1.1938523054122925, 0.0, 0.0, 0.0, 1.6577699184417725, 0.1800404191017151, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8408445715904236, 1.1485074758529663, 0.0, 2.5247387886047363, 0.346602201461792, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9494279026985168, 0.0, 0.9948981404304504, 4.064096450805664, 0.0, 0.0, 0.9229902029037476, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2752549648284912, 0.0, 1.1936497688293457, 0.48911750316619873, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.653761863708496, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.6348448991775513, 0.0, 0.0, 0.0, 0.9343314170837402, 0.218930184841156, 0.02627706527709961, 0.0, 0.0, 0.0, 0.12542420625686646, 0.0, 0.0, 0.0, 0.592560887336731, 0.7134367227554321, 3.445565938949585, 0.0, 0.0, 0.0, 0.0, 0.0, 1.1732879877090454, 0.0, 0.0, 0.0, 0.0, 0.0, 3.947272777557373, 0.0, 0.0, 3.8534412384033203, 0.0, 1.8455735445022583, 0.0, 0.0, 1.7678189277648926, 0.0, 4.514519691467285, 0.0, 0.0, 0.0, 0.0, 3.1988017559051514, 0.0, 1.2795207500457764, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 6.611478805541992, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7207418084144592, 0.0, 0.0, 0.0, 0.0, 2.1472275257110596, 0.0, 5.8998026847839355, 0.0, 0.0, 0.024957239627838135, 0.0, 0.0, 0.0, 0.0, 3.1803488731384277, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3210155963897705, 0.0, 0.0, 2.2360146045684814, 0.0, 0.0, 0.0, 0.0, 0.0, 0.016510844230651855, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.883721113204956, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.8818421363830566, 0.0, 0.0, 0.6482197046279907, 0.0, 0.0, 0.0, 0.0, 0.6032576560974121, 0.0, 0.294715940952301, 0.0, 0.0, 2.9288907051086426, 0.9835105538368225, 0.0, 0.44834595918655396, 0.0, 0.8002597093582153, 1.4286823272705078, 3.1790614128112793, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6830536127090454, 0.0, 0.0, 0.0, 0.22806984186172485, 0.0, 4.390357971191406, 0.0, 1.2836666107177734, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2967067956924438, 0.9244845509529114, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.982556104660034, 0.0, 0.6003455519676208, 2.3619205951690674, 0.0, 0.0, 0.0, 2.9337992668151855, 0.2494831681251526, 0.8430817723274231, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.968580961227417, 1.9043396711349487, 0.0, 0.0, 0.0, 0.36232268810272217, 0.0, 0.0, 0.0, 1.7045869827270508, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1441655158996582, 4.001467704772949, 1.1929601430892944, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.070071697235107, 0.0, 0.0, 0.0, 0.0, 0.5303606986999512, 0.0, 1.8382527828216553, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.401667594909668, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9912731051445007, 1.3606717586517334, 4.250411033630371, 0.0, 0.0, 1.813401222229004, 3.2112550735473633, 0.16770654916763306, 0.0, 0.0, 1.7144875526428223, 0.0, 0.0, 1.8340795040130615, 0.0, 0.0, 2.3780062198638916, 0.0, 0.0, 0.0, 1.9273982048034668, 0.0, 0.0, 0.0, 0.0, 4.357573509216309, 0.07580214738845825, 0.0, 0.0, 0.0, 0.0, 0.0, 1.727752447128296, 0.0, 0.0, 0.9869104027748108, 0.05813330411911011, 0.0, 4.056509494781494, 0.0, 0.0, 0.0, 0.6663181781768799, 0.0, 2.831491470336914, 0.24352365732192993, 2.0500566959381104, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.08726245164871216, 0.0, 0.5369541049003601, 0.0, 0.0, 0.2351575493812561, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.110135555267334, 0.0, 0.0, 0.0, 4.816840648651123, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.190485954284668, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.4912277460098267, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.5338876247406006, 0.0, 0.0, 0.0, 1.5910509824752808, 4.511062145233154, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.060845136642456, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0600773096084595, 0.0, 0.0, 0.0, 0.0, 0.0, 1.935929298400879, 0.0, 0.0, 1.8068437576293945, 0.0, 0.0, 0.0, 0.0, 0.0, 2.9564218521118164, 0.0, 0.16748332977294922, 0.8989858031272888, 0.0, 0.0, 1.0361534357070923, 0.0, 0.0, 0.0, 0.0, 0.35105788707733154, 0.0, 0.0, 0.0, 0.9472790956497192, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0467441082000732, 0.5349177122116089, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
4.468510150909424, 0.0, 0.0, 0.10836124420166016, 0.0, 0.38425177335739136, 0.0, 0.0, 0.0, 0.0, 0.5520187616348267, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.218446731567383, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2163012027740479, 0.5239739418029785, 0.0, 0.0, 0.0, 0.0, 2.594062566757202, 0.0, 2.0161123275756836, 0.014375627040863037, 0.0, 0.0, 0.0, 1.9633328914642334, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.1132663488388062, 0.0, 0.0, 0.0, 0.0, 0.0, 2.5307741165161133, 0.0, 1.8431365489959717, 0.0, 0.0, 0.18958234786987305, 0.0, 0.5081480741500854, 2.1913535594940186, 0.0, 3.491729974746704, 0.0, 0.0, 3.310110569000244, 0.0, 0.0, 1.1694470643997192, 0.0, 0.0, 0.0, 0.0, 2.6551220417022705, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.009976804256439209, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.8906850814819336, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.9018874168395996, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7325394153594971, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0497589111328125, 0.19528263807296753, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.5421628952026367, 1.5471539497375488, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2185113430023193, 0.21779221296310425, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.551121711730957, 0.0, 0.10967522859573364, 0.0, 0.0, 3.557755947113037, 0.0, 0.0, 1.6853843927383423, 0.0, 3.1250641345977783, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0795655250549316, 0.7931410074234009, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.147986650466919, 0.0, 0.0, 0.0, 0.0, 0.6819983720779419, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0952574014663696, 1.1658421754837036, 0.0, 0.0, 0.0, 3.11897873878479, 0.0, 0.0, 0.0, 0.0, 0.0, 0.32130521535873413, 0.0, 0.0, 2.313304901123047, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.6720740795135498, 0.0, 0.0, 0.0, 0.0, 0.0, 1.148186445236206, 0.0, 2.0801618099212646, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2541583776473999, 
1.6649057865142822, 1.9655659198760986, 0.0, 0.0, 0.0, 0.39298200607299805, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7260769605636597, 1.2686522006988525, 2.329824209213257, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8296180963516235, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.3684415817260742, 0.0, 0.6934060454368591, 0.0, 0.0, 0.0, 1.0131869316101074, 0.0, 1.5393842458724976, 0.0, 0.0, 0.0, 0.0, 0.0, 0.44677090644836426, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9235547184944153, 0.0, 0.0, 0.0, 2.320772647857666, 0.0, 0.0, 0.0, 3.5808005332946777, 0.0, 0.0, 1.9515089988708496, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8385280966758728, 0.0, 0.0, 0.0, 0.0, 0.0, 4.434026718139648, 0.0, 0.0, 0.0, 0.0, 0.0, 0.527511715888977, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.3615615367889404, 0.0, 0.0, 0.7795660495758057, 4.613838195800781, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.9205001592636108, 0.0, 0.0, 0.0, 0.0, 4.349380970001221, 0.0, 0.0, 0.0, 0.1285516619682312, 3.414609909057617, 0.0, 1.2090864181518555, 0.0, 0.0, 0.35465890169143677, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.579057216644287, 0.0, 1.4293879270553589, 5.158079624176025, 0.0, 0.0, 2.923259973526001, 0.0, 0.0, 0.0, 2.190636396408081, 0.0, 1.5443212985992432, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5000529289245605, 2.587780475616455, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6245059967041016, 0.7621824741363525, 0.0, 0.0, 0.0, 0.5479074120521545, 0.0, 1.1548807621002197, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.871946334838867, 0.0, 2.1023664474487305, 0.0, 0.0, 0.0, 0.0, 0.0, 3.078902006149292, 7.370853424072266, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.277357578277588, 0.8757790327072144, 1.0458472967147827, 0.0, 2.0205326080322266, 0.0, 0.1507643461227417, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.85890007019043, 0.0, 0.0, 1.0012587308883667, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.10504806041717529, 0.0, 0.0, 0.9165911674499512, 0.0, 0.0, 
0.935358464717865, 0.0, 0.0, 1.608163833618164, 0.0, 0.8791738748550415, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.5419597625732422, 0.0, 0.0, 0.0, 0.23680561780929565, 0.0, 2.8044042587280273, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.4105257987976074, 0.0, 0.0, 3.9846324920654297, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.8073856830596924, 0.0169258713722229, 0.0, 0.0, 0.0, 3.223792314529419, 0.0, 0.0, 0.0, 1.0484896898269653, 0.0, 0.0, 0.25179827213287354, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.036999225616455, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0535027980804443, 1.147221326828003, 0.0, 0.9710681438446045, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.3336167335510254, 0.0, 0.5301799178123474, 0.0058026909828186035, 0.0, 2.2799577713012695, 0.0, 0.20579171180725098, 0.0, 0.8566467761993408, 0.3617597818374634, 0.8159303069114685, 3.321990489959717, 3.334052085876465, 2.8698272705078125, 0.0, 0.0, 0.0, 2.3044657707214355, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25547176599502563, 0.7950807809829712, 0.0, 1.2440204620361328, 0.0, 0.0, 1.8081254959106445, 2.1630349159240723, 0.0, 0.0, 0.0, 0.0, 0.0, 0.22434866428375244, 0.0, 0.0, 0.0, 2.917583703994751, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3565129041671753, 0.0, 0.0, 0.0, 0.0, 0.0, 1.299020767211914, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.29825806617736816, 0.0, 0.0, 0.0, 2.6589651107788086, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8534784913063049, 0.08757233619689941, 0.0, 0.0, 0.0, 3.9850292205810547, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.468726634979248, 0.6855865716934204, 0.0, 2.8803508281707764, 0.0, 0.22449064254760742, 0.0, 0.0, 0.0, 0.0, 4.38340425491333, 0.0, 4.667736530303955, 0.0, 0.17183083295822144, 0.0, 0.6357376575469971, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3342987895011902, 0.0, 0.0, 0.6165604591369629, 0.0, 0.0, 0.804053544998169, 0.0, 0.0, 0.0, 0.0, 1.5824685096740723, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.9535064697265625, 0.0, 0.0, 0.0, 0.0, 0.0, 
2.053340435028076, 0.17965924739837646, 0.07797104120254517, 0.0, 0.30834025144577026, 0.0, 0.0, 0.0, 0.0, 0.5385817289352417, 0.0, 3.2902672290802, 0.0, 0.0, 0.0, 0.0, 1.9806640148162842, 0.0, 0.0, 0.0, 2.6047699451446533, 0.0, 0.0, 0.0, 5.1333746910095215, 0.0, 0.0, 2.047464370727539, 0.0, 1.442402958869934, 0.26934677362442017, 1.17965829372406, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7876006364822388, 1.830918550491333, 0.0, 0.0, 2.744903564453125, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.7004008293151855, 1.1800247430801392, 0.25432080030441284, 1.6052972078323364, 0.0, 0.07802462577819824, 0.0, 0.0, 0.0, 1.3730870485305786, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6932621598243713, 0.0, 0.0, 0.0, 0.0, 1.2852623462677002, 0.0, 0.0, 0.13557612895965576, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.15472137928009033, 0.0, 0.0, 0.0, 0.0, 1.2558143138885498, 0.0, 1.7469621896743774, 0.0, 0.0, 0.0, 0.7083309888839722, 0.0, 0.0, 0.0, 0.0, 0.0, 0.45387113094329834, 0.0, 0.8952315449714661, 0.9081637859344482, 1.5522270202636719, 1.5848771333694458, 2.310703754425049, 0.3540106415748596, 0.0, 0.0, 0.0, 0.0, 0.0, 2.116093158721924, 0.0, 0.0, 0.0, 0.0, 0.23059052228927612, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.8990631103515625, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3248802423477173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.1155574321746826, 0.0, 0.0, 1.0351547002792358, 0.0, 0.0, 0.0, 0.0, 1.2227416038513184, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5225957632064819, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.3740475177764893, 0.0, 0.0, 0.0, 0.0, 1.2292271852493286, 0.0, 0.0, 1.1282886266708374, 1.7682487964630127, 3.8387110233306885, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.8077771663665771, 1.8848837614059448, 0.0, 0.4300013780593872, 0.0, 0.0, 0.6775720715522766, 0.0, 0.017988264560699463, 0.0, 1.6263483762741089, 0.5783725380897522, 0.7083770036697388, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.8065452575683594, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3916711211204529, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.3274562358856201, 0.3257870674133301, 0.1983601450920105, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.1327593326568604, 0.5148743391036987, 0.0, 0.0, 0.0, 0.0, 3.184417247772217, 1.5542395114898682, 0.0, 0.0, 0.67621910572052, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0695086121559143, 0.0, 4.357614040374756, 0.0, 0.0, 0.19334501028060913, 0.0, 0.0, 0.0, 0.0, 0.0, 4.62748908996582, 0.0, 1.586611032485962, 1.176072597503662, 1.288231611251831, 2.6475706100463867, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.19419527053833, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7364046573638916, 0.0, 2.8783762454986572, 0.0, 0.0, 0.8799570202827454, 0.0, 0.0, 0.0, 0.9734141826629639, 0.0, 3.5458106994628906, 0.0, 0.0, 0.0, 0.0, 0.0, 2.238978624343872, 0.0, 0.0, 0.0, 3.9519309997558594, 0.0, 0.0, 0.0, 1.4500248432159424, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.026718258857727, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.11662840843200684, 0.0, 0.0, 0.12730824947357178, 0.0, 0.0, 1.3626717329025269, 0.0, 0.0, 1.64488685131073, 0.0, 0.945004940032959, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.341313123703003, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7456681728363037, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.1767094135284424, 0.5840926766395569, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.279904365539551, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2811555862426758, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8730319142341614, 4.6344194412231445, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6607314348220825, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.7126729488372803, 0.0, 0.0, 1.5001487731933594, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.5117542743682861, 0.0017873048782348633, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.5965735912323, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.8762906789779663, 0.0, 1.96806001663208, 0.0, 0.0, 0.0, 1.4140459299087524, 1.6221210956573486, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.15051960945129395, 0.0, 0.0, 0.0, 0.0, 1.842442512512207, 0.0, 0.0, 0.0, 0.0, 0.0, 3.1614770889282227, 0.0, 3.476168632507324, 4.024957656860352, 0.0, 1.0240252017974854, 0.0, 0.0, 0.0, 0.2793390154838562, 0.0, 0.0, 0.0, 2.348482131958008, 0.0, 1.6225767135620117, 0.9189393520355225, 0.0, 0.0, 0.0, 1.409437656402588, 0.0, 0.0, 0.0, 0.0, 0.0, 3.256230354309082, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.7507195472717285, 0.039553046226501465, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.10228508710861206, 0.0, 3.734164237976074, 1.2873079776763916, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.251305103302002, 0.0, 0.0, 0.0, 0.0, 0.0, 4.593714714050293, 0.0, 0.0, 0.0, 1.7604343891143799, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.013613700866699219, 1.8494305610656738, 0.0, 0.0, 2.209419012069702, 0.7927732467651367, 0.09167104959487915, 0.01727616786956787, 0.0, 0.0, 0.0, 1.4262025356292725, 0.0, 0.0, 1.5401420593261719, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.28971946239471436, 0.0, 0.0, 3.007366180419922, 1.152816891670227, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.23387932777404785, 0.0, 0.0, 0.0, 1.4974181652069092, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.13992130756378174, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.021028518676757812, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.085939884185791, 0.0, 1.6475903987884521, 0.0, 0.5451744794845581, 0.10539263486862183, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.1182966232299805, 0.0, 0.0, 0.0, 0.0, 0.3577491044998169, 0.0, 0.37824684381484985, 0.0, 0.0, 1.3385992050170898, 0.0, 0.0, 0.44659024477005005, 0.0, 0.0, 0.0, 0.0, 0.0, 3.9265177249908447, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.6888657212257385, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.9102585315704346, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.11561346054077148, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0114965438842773, 0.0, 0.0, 0.0, 0.0, 2.807081699371338, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.24607563018798828, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.08802026510238647, 0.0, 4.798876762390137, 0.0, 0.0, 0.0, 0.0, 0.0, 1.7971320152282715, 0.0, 0.0, 0.0, 0.0, 0.6014995574951172, 0.0, 0.0, 0.0, 0.08293092250823975, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.06687706708908081, 2.363736867904663, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0656093955039978, 0.0, 0.0, 0.0, 0.0, 0.2975490689277649, 1.856130599975586, 2.3225207328796387, 0.0, 0.0, 0.0, 1.0493489503860474, 0.0, 0.4142451882362366, 0.0, 1.7208514213562012, 0.0, 0.0, 0.3550347685813904, 3.4787068367004395, 0.0, 0.0, 0.0, 0.6079901456832886, 0.0, 0.0, 0.340478777885437, 1.6138036251068115, 0.0, 0.7326376438140869, 0.0, 0.22730648517608643, 0.0, 0.0, 0.0, 0.0, 0.0, 0.10450536012649536, 0.0, 0.0, 2.0135834217071533, 0.24897664785385132, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.859761357307434, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.6489973068237305, 0.0, 0.0, 0.0, 1.7460870742797852, 2.7748613357543945, 3.638943672180176, 0.0, 0.0, 0.0, 0.8629928827285767, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7381548881530762, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2660506963729858, 0.0, 0.0, 0.0, 1.163627028465271, 3.323441982269287, 0.0, 0.0, 0.8784735798835754, 0.12142086029052734, 0.0, 0.0, 0.0, 0.5366548895835876, 0.0, 7.055778980255127, 0.0, 0.0, 1.3175956010818481, 0.9913323521614075, 3.470764398574829, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.47612810134887695, 0.4216766953468323, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.1352293491363525, 0.0, 
[... output truncated: a single 'features' vector -- thousands of floating-point neuron activations per image, most of them 0.0 ...]

We now extract the features for all the images in our dataset using extract_features. This does the following:

  1. Each image in our dataset is propagated through our pre-trained neural network.
  2. At each layer of the network, some or all neurons are excited -- to varying degrees -- by the image.
  3. We represent a given layer's response as a vector of its neurons' activations; that vector is the image's feature vector.
  4. An additional parameter, layer_id, lets you choose which fully-connected layer to extract features from. The default, layer_id=None, returns features from the penultimate layer of the pre-trained DNN of your choosing.
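Conceptually, the steps above can be sketched with a toy network in plain NumPy. The weights below are random stand-ins for the real pre-trained model, and this `extract_features` is a simplified, hypothetical version of the GraphLab call -- the real model has many more layers and neurons:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights standing in for the pre-trained model's layers.
W1 = rng.normal(size=(12, 8))   # input -> hidden (penultimate) layer
W2 = rng.normal(size=(8, 4))    # hidden -> output layer (not used for features)

def extract_features(image_vec):
    """Propagate a flattened image through the net; return penultimate-layer activations."""
    hidden = np.maximum(0.0, image_vec @ W1)  # ReLU "excitement" of each neuron
    return hidden                             # this vector is the image's feature vector

image_vec = rng.normal(size=12)   # stand-in for a flattened image
features = extract_features(image_vec)
print(features.shape)             # one entry per penultimate-layer neuron -> (8,)
```

Note that ReLU units output exactly 0.0 when unexcited, which is why the extracted feature vectors above are mostly zeros.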

In [7]:
# extracted_features = pretrained_model.extract_features(image_sf[['image']]) #Caution! this may take a while
# image_sf['features'] = extracted_features ## adding the extracted_features to our SFrame

Step 3: Calculating the Distance

This is the last step in building the similar-items recommendation model. Using the features we extracted above, we create a k-nearest neighbors model that measures the distance between every pair of feature vectors; this is what lets end users find the products whose images match most closely.
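Under the hood, a brute-force nearest-neighbors query is just a distance computation plus a sort. A minimal NumPy sketch of the idea, assuming Euclidean distance and toy 2-D feature vectors (the real model works on the long feature vectors extracted above):

```python
import numpy as np

# Toy "reference" feature vectors, one row per image.
features = np.array([[0.0, 0.0],
                     [1.0, 0.0],
                     [0.0, 3.0],
                     [2.0, 2.0]])

query = np.array([0.9, 0.1])                      # feature vector of the query image

dists = np.linalg.norm(features - query, axis=1)  # Euclidean distance to each row
order = np.argsort(dists)                         # row indices, closest first

k = 2
print(order[:k])                                  # indices of the two closest images -> [1 0]
```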


In [8]:
nn_model = gl.nearest_neighbors.create(image_sf, 
                                       features=['features'])  # We're using the pre-extracted features


PROGRESS: Starting brute force nearest neighbors model training.

Step 4: Finding Similar Items

Query 1: Blue, da ba dee da ba die


In [9]:
blue = image_sf[194:195]
blue['image'].show()


Canvas is updated and available in a tab in the default browser.

In [10]:
num_neighbors = 42  ## number of nearest neighbors to query; any value from 1 to len(image_sf)

similar_to_blue = nn_model.query(blue, 
                                 k=num_neighbors, 
                                 verbose=True)

similar_to_blue


PROGRESS: Starting pairwise querying.
PROGRESS: +--------------+---------+-------------+--------------+
PROGRESS: | Query points | # Pairs | % Complete. | Elapsed Time |
PROGRESS: +--------------+---------+-------------+--------------+
PROGRESS: | 0            | 0       | 0           | 151.944ms    |
PROGRESS: | 0            | 0       | 0           | 1.15s        |
PROGRESS: | 0            | 0       | 0           | 2.16s        |
PROGRESS: | 0            | 0       | 0           | 3.16s        |
PROGRESS: | 0            | 0       | 0           | 4.17s        |
PROGRESS: | 0            | 183     | 2.3619      | 5.17s        |
PROGRESS: | 0            | 1557    | 20.0955     | 6.18s        |
PROGRESS: | 0            | 2800    | 36.1384     | 7.18s        |
PROGRESS: | 0            | 4126    | 53.2525     | 8.19s        |
PROGRESS: | 0            | 5461    | 70.4827     | 9.19s        |
PROGRESS: | 0            | 6943    | 89.6102     | 10.20s       |
PROGRESS: | Done         |         | 100         | 10.88s       |
PROGRESS: +--------------+---------+-------------+--------------+
Out[10]:
query_label reference_label distance rank
0 194 0.0 1
0 1386 25.812013028 2
0 836 26.9153061841 3
0 1186 27.0171591504 4
0 1145 29.0164940311 5
0 779 29.1384319963 6
0 1692 29.6532916369 7
0 1652 29.6687082159 8
0 4230 29.8062979896 9
0 7198 30.2582978659 10
[42 rows x 4 columns]
Note: Only the head of the SFrame is printed.
You can use print_rows(num_rows=m, num_columns=n) to print more rows and columns.

In [11]:
## To get the images, we need to join the reference label in our kNN model to our main SFrame
blue_images = image_sf.join(similar_to_blue, 
                            on={'_id':'reference_label'}
                           ).sort('distance')

## Let's show just the first 10 nearest neighbors
blue_images['image'][0:10].show() ## take a length-10 slice of the 42 results from the query above


Canvas is updated and available in a tab in the default browser.
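The join matches each '_id' in our main SFrame against 'reference_label' in the query results, and the sort orders rows by distance. A rough plain-Python sketch of the same logic, using hypothetical record dicts in place of SFrames:

```python
# Hypothetical records standing in for rows of image_sf and the kNN results.
records = [
    {'_id': 194,  'image': 'blue_dress.jpg'},
    {'_id': 1386, 'image': 'navy_dress.jpg'},
    {'_id': 836,  'image': 'teal_dress.jpg'},
]
results = [
    {'reference_label': 1386, 'distance': 25.8, 'rank': 2},
    {'reference_label': 194,  'distance': 0.0,  'rank': 1},
    {'reference_label': 836,  'distance': 26.9, 'rank': 3},
]

by_id = {r['_id']: r for r in records}            # index the main table by '_id'
joined = sorted(
    ({**by_id[res['reference_label']], **res} for res in results),
    key=lambda row: row['distance'],              # nearest first
)
print([row['image'] for row in joined[:2]])       # -> ['blue_dress.jpg', 'navy_dress.jpg']
```

Note the query image itself comes back at distance 0.0, which is why it appears first in the joined results.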

Query 2: Similarity with more unique images


In [12]:
interesting = image_sf[2:3]
interesting['image'].show()


Canvas is updated and available in a tab in the default browser.

In [13]:
similar_to_interesting = nn_model.query(interesting, k=10, verbose=True)
similar_to_interesting


PROGRESS: Starting pairwise querying...
PROGRESS: +--------------+---------+-------------+--------------+
PROGRESS: | Query points | # Pairs | % Complete. | Elapsed Time |
PROGRESS: +--------------+---------+-------------+--------------+
PROGRESS: | 0            | 1       | 0.0129066   | 20.978ms     |
PROGRESS: | Done         |         | 100         | 839.039ms    |
PROGRESS: +--------------+---------+-------------+--------------+
Out[13]:
query_label reference_label distance rank
0 2 0.0 1
0 6888 46.2355324181 2
0 709 47.7162288346 3
0 708 48.9607068692 4
0 3987 50.1019634976 5
0 6428 50.5241608881 6
0 4750 50.567732455 7
0 959 50.9372137755 8
0 34 51.0509499528 9
0 187 51.588050309 10
[10 rows x 4 columns]

In [14]:
## To get the images, we need to join the reference label in our kNN model to our main SFrame
interesting_images = image_sf.join(similar_to_interesting, 
                                   on={'_id':'reference_label'}
                                  ).sort('distance')

## Show the 10 nearest neighbors returned by the query
interesting_images['image'].show()


Canvas is accessible via web browser at the URL: http://localhost:59329/index.html
Opening Canvas in default web browser.

Query 3: Blue or Gold


In [15]:
Image('http://static.ddmcdn.com/gif/blue-dress.jpg')


Out[15]:

In [16]:
img = gl.Image('http://static.ddmcdn.com/gif/blue-dress.jpg')  ## graphlab was imported as gl
img = gl.image_analysis.resize(img, 256, 256, 3)  ## match the 256x256x3 shape the model expects
sf = gl.SFrame()
sf['image'] = [img]
sf['features'] = pretrained_model.extract_features(sf)


PROGRESS: Downloading http://static.ddmcdn.com/gif/blue-dress.jpg to /var/tmp/graphlab-piotrteterwak/7684/000000.jpg
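GraphLab's image_analysis.resize handles the resizing for us. As a rough illustration of what a resize to a fixed 256x256x3 shape involves (not GraphLab's actual algorithm), here is a nearest-neighbor resize sketch in NumPy:

```python
import numpy as np

def resize_nn(img, out_h, out_w):
    """Nearest-neighbor resize: map each output pixel back to its source pixel."""
    in_h, in_w = img.shape[:2]
    rows = np.arange(out_h) * in_h // out_h   # source row for each output row
    cols = np.arange(out_w) * in_w // out_w   # source column for each output column
    return img[rows][:, cols]

img = np.arange(4 * 6 * 3).reshape(4, 6, 3)   # toy 4x6 RGB image
resized = resize_nn(img, 256, 256)
print(resized.shape)                          # (256, 256, 3)
```

The fixed shape matters because the network's input layer expects every image to have the same dimensions.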

In [17]:
similar_to_blue_or_gold = nn_model.query(sf, k=10, verbose=True)


PROGRESS: Starting pairwise querying...
PROGRESS: +--------------+---------+-------------+--------------+
PROGRESS: | Query points | # Pairs | % Complete. | Elapsed Time |
PROGRESS: +--------------+---------+-------------+--------------+
PROGRESS: | 0            | 1       | 0.0129066   | 25.883ms     |
PROGRESS: | Done         |         | 100         | 870.674ms    |
PROGRESS: +--------------+---------+-------------+--------------+

In [18]:
blue_or_gold = image_sf.join(similar_to_blue_or_gold, 
                             on={'_id':'reference_label'}
                            ).sort('distance')

blue_or_gold['image'].show()


Canvas is accessible via web browser at the URL: http://localhost:59329/index.html
Opening Canvas in default web browser.

Bibliography


Olga Russakovsky, Jia Deng, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael Bernstein, Alexander C. Berg and Li Fei-Fei. ImageNet Large Scale Visual Recognition Challenge. arXiv:1409.0575, 2014.

Donahue, J., Jia, Y., Vinyals, O., Hoffman, J., Zhang, N., Tzeng, E., and Darrell, T. DeCAF: A deep convolutional activation feature for generic visual recognition. In JMLR, 2014.

Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. "Imagenet classification with deep convolutional neural networks." In Advances in neural information processing systems, pp. 1097-1105. 2012.

Yang, J., L., Y., Tian, Y., Duan, L., and Gao, W. Group-sensitive multiple kernel learning for object categorization. In ICCV, 2009.

Szegedy, Christian, Wojciech Zaremba, Ilya Sutskever, Joan Bruna, Dumitru Erhan, Ian Goodfellow, and Rob Fergus. "Intriguing properties of neural networks." arXiv preprint arXiv:1312.6199 (2013).

Nguyen, Anh, Jason Yosinski, and Jeff Clune. "Deep neural networks are easily fooled: High confidence predictions for unrecognizable images." arXiv preprint arXiv:1412.1897 (2014).

Goodfellow, Ian J., Jonathon Shlens, and Christian Szegedy. "Explaining and harnessing adversarial examples." arXiv preprint arXiv:1412.6572 (2014).