Ellipses Project (Dvir Samuel)

Data Preprocessing


In [0]:
import numpy as np
from numpy import genfromtxt
from PIL import Image
import pandas as pd
from collections import Counter
import keras
from keras.layers.normalization import BatchNormalization
from keras.models import Model
from keras.layers import Input, Dense, Dropout, Activation, Flatten, Concatenate, Add
from keras.layers import Conv2D, MaxPooling2D, GlobalAveragePooling2D
from keras import backend as K
from keras.utils import plot_model
from keras.optimizers import Adam,Nadam,SGD
from keras.callbacks import ModelCheckpoint, ReduceLROnPlateau
import matplotlib.pyplot as plt
from sklearn.utils import class_weight
%matplotlib inline

In [0]:
# Load train data
train_data = pd.read_csv("images/train_data.txt",delimiter=", ",header=0, engine='python')
# organazie the data and seperate to different dataframes
seperated_col = train_data[train_data.columns[0]].str.partition(" ")[[0,2]]
X_train_images = seperated_col[[0]].rename(columns = {0:'paths'})
Y_train_class = seperated_col[[2]].rename(columns = {2:'class'})
del train_data[train_data.columns[0]]
Y_train_features = train_data
Training images paths

In [315]:
X_train_images.head()


Out[315]:
paths
0 images/train/0000.jpg
1 images/train/0001.jpg
2 images/train/0002.jpg
3 images/train/0003.jpg
4 images/train/0004.jpg
Training true labels - ellipse or not

In [316]:
Y_train_class.head()


Out[316]:
class
0 False
1 True
2 False
3 False
4 True
Training true features - ellipse parameters

In [317]:
Y_train_features.head()


Out[317]:
center_x center_y angle axis_1 axis_2
0 0 0 0 0 0
1 34 32 25 16 17
2 0 0 0 0 0
3 0 0 0 0 0
4 18 19 150 20 15

In [0]:
# Load test data
test_data = pd.read_csv("images/test_data.txt",delimiter=", ",header=0, engine='python')
# organazie the data and seperate to different dataframes
seperated_col = test_data[test_data.columns[0]].str.partition(" ")[[0,2]]
X_test_images = seperated_col[[0]].rename(columns = {0:'paths'})
Y_test_class = seperated_col[[2]].rename(columns = {2:'class'})
del test_data[test_data.columns[0]]
Y_test_features = test_data

In [319]:
# show some data samples
fig = plt.figure(figsize=(20,20))
for idx,path in enumerate(X_train_images['paths'].head()):
    img = Image.open(path)
    fig.add_subplot(1,5,idx+1)
    plt.imshow(img)


More preprocessing (data reordering and image normalization)
  • Normalize the angles - angles of 180 degrees or more are reduced modulo 180, since an ellipse rotated by some angle looks identical to one rotated by that angle plus 180

In [0]:
Y_train_features['angle'] = Y_train_features['angle']%180
Y_test_features['angle'] = Y_test_features['angle']%180
  • We would like the angle to always measure the rotation of the long axis relative to the x-axis. Right now it measures the rotation of the first axis relative to the x-axis (as we can see in the images), so we make sure the long axis is stored as axis_1, rotating the angle by 90 degrees whenever the axes are swapped.

In [321]:
# For TRAIN
# check where axis_1 is smaller than axis_2
indices = np.where(Y_train_features['axis_1'] < Y_train_features['axis_2'])[0]
print("Before:")
display(Y_train_features.loc[indices].head())
# swap axis_1 with axis_2
tmp = Y_train_features.loc[indices,'axis_1']
Y_train_features.loc[indices,'axis_1'] = Y_train_features.loc[indices,'axis_2']
Y_train_features.loc[indices,'axis_2'] = tmp
# rotate angle by 90
Y_train_features.loc[indices,'angle'] = (Y_train_features.loc[indices,'angle']+90)%180
print("After:")
display(Y_train_features.loc[indices].head())

# For TEST
# check where axis_1 is smaller than axis_2
indices = np.where(Y_test_features['axis_1'] < Y_test_features['axis_2'])[0]
# swap axis_1 with axis_2
tmp = Y_test_features.loc[indices,'axis_1']
Y_test_features.loc[indices,'axis_1'] = Y_test_features.loc[indices,'axis_2']
Y_test_features.loc[indices,'axis_2'] = tmp
# rotate angle by 90
Y_test_features.loc[indices,'angle'] = (Y_test_features.loc[indices,'angle']+90)%180


Before:
center_x center_y angle axis_1 axis_2
1 34 32 25 16 17
9 20 19 81 15 16
10 26 27 61 15 24
15 26 28 177 15 18
16 24 25 160 4 13
After:
center_x center_y angle axis_1 axis_2
1 34 32 115 17 16
9 20 19 171 16 15
10 26 27 151 24 15
15 26 28 87 18 15
16 24 25 70 13 4

In [322]:
# load train and test images into memory
X_train_o = np.array([np.array(Image.open(path)) for path in X_train_images['paths']]).astype('float32')
X_test_o = np.array([np.array(Image.open(path)) for path in X_test_images['paths']]).astype('float32')
print("Train shape:", X_train_o.shape)
print("Test shape:", X_test_o.shape)


Train shape: (10000, 50, 50, 3)
Test shape: (1000, 50, 50, 3)

In [0]:
# normalize RGB images to [0, 1]
X_train = X_train_o / 255
X_test = X_test_o / 255
# convert the 'True'/'False' class labels to binary 0/1
Y_train_class['class'] = (Y_train_class['class'].values == 'True').astype(float)
Y_test_class['class'] = (Y_test_class['class'].values == 'True').astype(float)

Data Distribution

Check if data is balanced


In [324]:
print("Train Data:")
print(Counter(Y_train_class['class']))
print("Test Data")
print(Counter(Y_test_class['class']))


Train Data:
Counter({1.0: 6997, 0.0: 3003})
Test Data
Counter({1.0: 688, 0.0: 312})

As we can see, there are roughly 2.3 times more ellipses than non-ellipses, so during training we will give the non-ellipse samples a larger class weight


In [325]:
class_weights = class_weight.compute_class_weight('balanced', np.unique(Y_train_class['class']), Y_train_class['class'])
class_weights = {cid : weight for cid, weight in enumerate(class_weights)}
print(class_weights)


{0: 1.665001665001665, 1: 0.7145919679862799}
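As a quick hand-check (not in the original notebook), scikit-learn's `'balanced'` heuristic is just `w_c = n_samples / (n_classes * n_c)`; plugging in the class counts printed above reproduces the weights:

```python
# 'balanced' class weights: w_c = n_samples / (n_classes * count_c)
n_samples = 10000
counts = {0: 3003, 1: 6997}  # non-ellipses / ellipses, from the Counter output above
weights = {c: n_samples / (2 * n) for c, n in counts.items()}
print(weights)  # -> {0: 1.665001665001665, 1: 0.7145919679862799}
```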

Approach 1: Image Processing method

First, I will check whether we can get reasonable results with OpenCV and traditional image-processing methods.

I experimented with contour detection and fitEllipse, but it did not give good results: most of the time it failed to find the ellipses even when they were drawn in the image.

Approach 2: Deep Learning method

I will implement a CNN that outputs both a True/False classification (whether an ellipse is present) and the ellipse features


In [326]:
# build the model
inputs = Input(shape=X_train.shape[1:])
conv1 = Conv2D(32, (3, 3), padding='same', name='conv1')(inputs)
bn1 = BatchNormalization(axis=3,name='bn1')(conv1)
act1 = Activation('relu',name='act1')(bn1)
mp1 = MaxPooling2D(padding='same',name='mp1')(act1)
conv2 = Conv2D(64, (3, 3), padding='same',name='conv2')(mp1)
bn2 = BatchNormalization(axis=3,name='bn2')(conv2)
act2 = Activation('relu',name='act2')(bn2)
conv3 = Conv2D(64, (3, 3), padding='same',name='conv3')(act2)
conv4 = Conv2D(64, (1,1),name='conv4')(mp1)
add1 = Add(name='add1')([conv3, conv4])
bn3 = BatchNormalization(axis=3,name='bn3')(add1)
act3 = Activation('relu',name='act3')(bn3)
conv5 = Conv2D(128, (3, 3), padding='same', strides=(2,2),name='conv5')(act3)
bn4 = BatchNormalization(axis=3,name='bn4')(conv5)
act4 = Activation('relu',name='act4')(bn4)
conv6 = Conv2D(128, (3, 3), padding='same',name='conv6')(act4)
conv7 = Conv2D(128, (1,1), strides=(2,2),name='conv7')(add1)
add2 = Add(name='add2')([conv6, conv7])
bn5 = BatchNormalization(axis=3,name='bn5')(add2)
act5 = Activation('relu',name='act5')(bn5)
flat1 = Flatten(name='flat1')(act5)
dense1 = Dense(512,name='dense1')(flat1)
bn6 = BatchNormalization(name='bn6')(dense1)
act6 = Activation('relu',name='act6')(bn6)
do1 = Dropout(0.2,name='do1')(act6)
dense2 = Dense(1,name='dense2')(do1)
classification = Activation('sigmoid', name = 'class')(dense2)
dense3 = Dense(5,name='dense3')(do1)
features = Activation('relu',name='features')(dense3)

model = Model(inputs=inputs, outputs=[classification,features])

model.summary()
plot_model(model, to_file="model.png")


__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_28 (InputLayer)           (None, 50, 50, 3)    0                                            
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, 50, 50, 32)   896         input_28[0][0]                   
__________________________________________________________________________________________________
bn1 (BatchNormalization)        (None, 50, 50, 32)   128         conv1[0][0]                      
__________________________________________________________________________________________________
act1 (Activation)               (None, 50, 50, 32)   0           bn1[0][0]                        
__________________________________________________________________________________________________
mp1 (MaxPooling2D)              (None, 25, 25, 32)   0           act1[0][0]                       
__________________________________________________________________________________________________
conv2 (Conv2D)                  (None, 25, 25, 64)   18496       mp1[0][0]                        
__________________________________________________________________________________________________
bn2 (BatchNormalization)        (None, 25, 25, 64)   256         conv2[0][0]                      
__________________________________________________________________________________________________
act2 (Activation)               (None, 25, 25, 64)   0           bn2[0][0]                        
__________________________________________________________________________________________________
conv3 (Conv2D)                  (None, 25, 25, 64)   36928       act2[0][0]                       
__________________________________________________________________________________________________
conv4 (Conv2D)                  (None, 25, 25, 64)   2112        mp1[0][0]                        
__________________________________________________________________________________________________
add1 (Add)                      (None, 25, 25, 64)   0           conv3[0][0]                      
                                                                 conv4[0][0]                      
__________________________________________________________________________________________________
bn3 (BatchNormalization)        (None, 25, 25, 64)   256         add1[0][0]                       
__________________________________________________________________________________________________
act3 (Activation)               (None, 25, 25, 64)   0           bn3[0][0]                        
__________________________________________________________________________________________________
conv5 (Conv2D)                  (None, 13, 13, 128)  73856       act3[0][0]                       
__________________________________________________________________________________________________
bn4 (BatchNormalization)        (None, 13, 13, 128)  512         conv5[0][0]                      
__________________________________________________________________________________________________
act4 (Activation)               (None, 13, 13, 128)  0           bn4[0][0]                        
__________________________________________________________________________________________________
conv6 (Conv2D)                  (None, 13, 13, 128)  147584      act4[0][0]                       
__________________________________________________________________________________________________
conv7 (Conv2D)                  (None, 13, 13, 128)  8320        add1[0][0]                       
__________________________________________________________________________________________________
add2 (Add)                      (None, 13, 13, 128)  0           conv6[0][0]                      
                                                                 conv7[0][0]                      
__________________________________________________________________________________________________
bn5 (BatchNormalization)        (None, 13, 13, 128)  512         add2[0][0]                       
__________________________________________________________________________________________________
act5 (Activation)               (None, 13, 13, 128)  0           bn5[0][0]                        
__________________________________________________________________________________________________
flat1 (Flatten)                 (None, 21632)        0           act5[0][0]                       
__________________________________________________________________________________________________
dense1 (Dense)                  (None, 512)          11076096    flat1[0][0]                      
__________________________________________________________________________________________________
bn6 (BatchNormalization)        (None, 512)          2048        dense1[0][0]                     
__________________________________________________________________________________________________
act6 (Activation)               (None, 512)          0           bn6[0][0]                        
__________________________________________________________________________________________________
do1 (Dropout)                   (None, 512)          0           act6[0][0]                       
__________________________________________________________________________________________________
dense2 (Dense)                  (None, 1)            513         do1[0][0]                        
__________________________________________________________________________________________________
dense3 (Dense)                  (None, 5)            2565        do1[0][0]                        
__________________________________________________________________________________________________
class (Activation)              (None, 1)            0           dense2[0][0]                     
__________________________________________________________________________________________________
features (Activation)           (None, 5)            0           dense3[0][0]                     
==================================================================================================
Total params: 11,371,078
Trainable params: 11,369,222
Non-trainable params: 1,856
__________________________________________________________________________________________________

In [327]:
print("Model structure:")
fig = plt.figure(figsize=(25,30))
model_img = Image.open("model.png")
plt.imshow(model_img)


Model structure:
Out[327]:
<matplotlib.image.AxesImage at 0x7f22a2c67ef0>

Model training


In [0]:
# hyperparams
epochs = 50
batch_size = 64
optAlgo = Adam()

# compile model
model.compile(optAlgo,loss=['binary_crossentropy','mse'], metrics=['accuracy'],loss_weights=[1.,.01])

# callbacks: save the model with the lowest validation loss
save_to = 'best_model.h5'
checkpoint = ModelCheckpoint(filepath=save_to,verbose=0, save_best_only=True)
callbacks = [checkpoint]

In [329]:
# train model
history = model.fit(X_train, [Y_train_class,Y_train_features], shuffle=True, batch_size=batch_size, 
                             epochs=epochs, verbose=1, validation_split=0.3, callbacks = callbacks)


Train on 7000 samples, validate on 3000 samples
Epoch 1/50
7000/7000 [==============================] - 17s 2ms/step - loss: 10.8294 - class_loss: 0.1448 - features_loss: 1068.4537 - class_acc: 0.9486 - features_acc: 0.5153 - val_loss: 10.5512 - val_class_loss: 0.9715 - val_features_loss: 957.9696 - val_class_acc: 0.7003 - val_features_acc: 0.5167
Epoch 2/50
7000/7000 [==============================] - 8s 1ms/step - loss: 3.8368 - class_loss: 0.0517 - features_loss: 378.5116 - class_acc: 0.9846 - features_acc: 0.7049 - val_loss: 29.1661 - val_class_loss: 11.3740 - val_features_loss: 1779.2139 - val_class_acc: 0.2943 - val_features_acc: 0.3447
Epoch 3/50
7000/7000 [==============================] - 8s 1ms/step - loss: 2.0242 - class_loss: 0.0290 - features_loss: 199.5207 - class_acc: 0.9917 - features_acc: 0.7614 - val_loss: 21.5872 - val_class_loss: 6.2837 - val_features_loss: 1530.3551 - val_class_acc: 0.3513 - val_features_acc: 0.5870
Epoch 4/50
7000/7000 [==============================] - 8s 1ms/step - loss: 1.7765 - class_loss: 0.0263 - features_loss: 175.0226 - class_acc: 0.9914 - features_acc: 0.7843 - val_loss: 28.9654 - val_class_loss: 11.1783 - val_features_loss: 1778.7097 - val_class_acc: 0.2943 - val_features_acc: 0.3483
Epoch 5/50
7000/7000 [==============================] - 8s 1ms/step - loss: 1.4826 - class_loss: 0.0180 - features_loss: 146.4652 - class_acc: 0.9953 - features_acc: 0.8054 - val_loss: 12.9822 - val_class_loss: 2.6349 - val_features_loss: 1034.7276 - val_class_acc: 0.7063 - val_features_acc: 0.5920
Epoch 6/50
7000/7000 [==============================] - 8s 1ms/step - loss: 1.2916 - class_loss: 0.0112 - features_loss: 128.0408 - class_acc: 0.9973 - features_acc: 0.8249 - val_loss: 5.7238 - val_class_loss: 1.1425 - val_features_loss: 458.1265 - val_class_acc: 0.7413 - val_features_acc: 0.7087
Epoch 7/50
7000/7000 [==============================] - 8s 1ms/step - loss: 1.1563 - class_loss: 0.0113 - features_loss: 114.5037 - class_acc: 0.9964 - features_acc: 0.8476 - val_loss: 9.3236 - val_class_loss: 2.0869 - val_features_loss: 723.6677 - val_class_acc: 0.7070 - val_features_acc: 0.5990
Epoch 8/50
7000/7000 [==============================] - 8s 1ms/step - loss: 1.0565 - class_loss: 0.0112 - features_loss: 104.5290 - class_acc: 0.9969 - features_acc: 0.8521 - val_loss: 19.1562 - val_class_loss: 3.1398 - val_features_loss: 1601.6349 - val_class_acc: 0.7067 - val_features_acc: 0.5920
Epoch 9/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.8177 - class_loss: 0.0047 - features_loss: 81.2919 - class_acc: 0.9990 - features_acc: 0.8659 - val_loss: 11.7534 - val_class_loss: 2.5790 - val_features_loss: 917.4462 - val_class_acc: 0.7200 - val_features_acc: 0.5923
Epoch 10/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.7562 - class_loss: 0.0108 - features_loss: 74.5440 - class_acc: 0.9966 - features_acc: 0.8641 - val_loss: 13.1303 - val_class_loss: 1.2666 - val_features_loss: 1186.3668 - val_class_acc: 0.7420 - val_features_acc: 0.5923
Epoch 11/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.5876 - class_loss: 0.0031 - features_loss: 58.4437 - class_acc: 0.9989 - features_acc: 0.8866 - val_loss: 11.7086 - val_class_loss: 2.7054 - val_features_loss: 900.3190 - val_class_acc: 0.7077 - val_features_acc: 0.5920
Epoch 12/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.4532 - class_loss: 0.0017 - features_loss: 45.1452 - class_acc: 0.9999 - features_acc: 0.8970 - val_loss: 4.6767 - val_class_loss: 0.2938 - val_features_loss: 438.2862 - val_class_acc: 0.9007 - val_features_acc: 0.6397
Epoch 13/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.3918 - class_loss: 0.0013 - features_loss: 39.0481 - class_acc: 0.9997 - features_acc: 0.9070 - val_loss: 13.9242 - val_class_loss: 3.5958 - val_features_loss: 1032.8414 - val_class_acc: 0.7060 - val_features_acc: 0.5920
Epoch 14/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.3657 - class_loss: 0.0017 - features_loss: 36.4013 - class_acc: 0.9996 - features_acc: 0.9167 - val_loss: 2.5552 - val_class_loss: 0.2512 - val_features_loss: 230.4003 - val_class_acc: 0.9213 - val_features_acc: 0.6383
Epoch 15/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2922 - class_loss: 8.4930e-04 - features_loss: 29.1357 - class_acc: 0.9999 - features_acc: 0.9214 - val_loss: 1.8963 - val_class_loss: 0.0221 - val_features_loss: 187.4237 - val_class_acc: 0.9910 - val_features_acc: 0.6773
Epoch 16/50
7000/7000 [==============================] - 9s 1ms/step - loss: 0.2785 - class_loss: 3.9760e-04 - features_loss: 27.8080 - class_acc: 1.0000 - features_acc: 0.9233 - val_loss: 3.4773 - val_class_loss: 0.4469 - val_features_loss: 303.0363 - val_class_acc: 0.8783 - val_features_acc: 0.6283
Epoch 17/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2452 - class_loss: 6.8853e-04 - features_loss: 24.4510 - class_acc: 0.9999 - features_acc: 0.9347 - val_loss: 4.1712 - val_class_loss: 0.3624 - val_features_loss: 380.8826 - val_class_acc: 0.9443 - val_features_acc: 0.7600
Epoch 18/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2723 - class_loss: 0.0032 - features_loss: 26.9099 - class_acc: 0.9990 - features_acc: 0.9323 - val_loss: 14.5993 - val_class_loss: 3.5775 - val_features_loss: 1102.1792 - val_class_acc: 0.7057 - val_features_acc: 0.5920
Epoch 19/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2597 - class_loss: 0.0027 - features_loss: 25.7056 - class_acc: 0.9991 - features_acc: 0.9370 - val_loss: 9.0545 - val_class_loss: 2.3833 - val_features_loss: 667.1223 - val_class_acc: 0.7150 - val_features_acc: 0.6120
Epoch 20/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2412 - class_loss: 0.0011 - features_loss: 24.0122 - class_acc: 0.9997 - features_acc: 0.9429 - val_loss: 2.2649 - val_class_loss: 0.0678 - val_features_loss: 219.7012 - val_class_acc: 0.9887 - val_features_acc: 0.9157
Epoch 21/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2162 - class_loss: 2.0164e-04 - features_loss: 21.6009 - class_acc: 1.0000 - features_acc: 0.9486 - val_loss: 11.5993 - val_class_loss: 1.2268 - val_features_loss: 1037.2513 - val_class_acc: 0.7363 - val_features_acc: 0.5953
Epoch 22/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2118 - class_loss: 3.0694e-04 - features_loss: 21.1533 - class_acc: 1.0000 - features_acc: 0.9474 - val_loss: 14.6615 - val_class_loss: 3.5237 - val_features_loss: 1113.7791 - val_class_acc: 0.7087 - val_features_acc: 0.5927
Epoch 23/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2237 - class_loss: 0.0014 - features_loss: 22.2290 - class_acc: 0.9996 - features_acc: 0.9403 - val_loss: 3.9270 - val_class_loss: 0.0459 - val_features_loss: 388.1073 - val_class_acc: 0.9823 - val_features_acc: 0.6297
Epoch 24/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1830 - class_loss: 2.1850e-04 - features_loss: 18.2802 - class_acc: 1.0000 - features_acc: 0.9549 - val_loss: 2.6152 - val_class_loss: 0.1551 - val_features_loss: 246.0081 - val_class_acc: 0.9707 - val_features_acc: 0.8820
Epoch 25/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1831 - class_loss: 2.1331e-04 - features_loss: 18.2865 - class_acc: 1.0000 - features_acc: 0.9544 - val_loss: 2.3945 - val_class_loss: 0.0671 - val_features_loss: 232.7457 - val_class_acc: 0.9880 - val_features_acc: 0.9150
Epoch 26/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.3130 - class_loss: 0.0145 - features_loss: 29.8464 - class_acc: 0.9949 - features_acc: 0.9374 - val_loss: 29.1661 - val_class_loss: 11.3740 - val_features_loss: 1779.2139 - val_class_acc: 0.2943 - val_features_acc: 0.3447
Epoch 27/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.5472 - class_loss: 0.0282 - features_loss: 51.9006 - class_acc: 0.9923 - features_acc: 0.9051 - val_loss: 17.6174 - val_class_loss: 2.6951 - val_features_loss: 1492.2268 - val_class_acc: 0.6193 - val_features_acc: 0.5383
Epoch 28/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2909 - class_loss: 0.0019 - features_loss: 28.9050 - class_acc: 0.9994 - features_acc: 0.9397 - val_loss: 15.9246 - val_class_loss: 3.9881 - val_features_loss: 1193.6467 - val_class_acc: 0.7067 - val_features_acc: 0.5920
Epoch 29/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1936 - class_loss: 5.2164e-04 - features_loss: 19.3108 - class_acc: 1.0000 - features_acc: 0.9554 - val_loss: 13.2837 - val_class_loss: 2.3017 - val_features_loss: 1098.1985 - val_class_acc: 0.7330 - val_features_acc: 0.6003
Epoch 30/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1622 - class_loss: 3.0444e-04 - features_loss: 16.1876 - class_acc: 1.0000 - features_acc: 0.9576 - val_loss: 11.8859 - val_class_loss: 4.0488 - val_features_loss: 783.7060 - val_class_acc: 0.7067 - val_features_acc: 0.5920
Epoch 31/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1678 - class_loss: 4.9130e-04 - features_loss: 16.7317 - class_acc: 1.0000 - features_acc: 0.9543 - val_loss: 15.5311 - val_class_loss: 3.7586 - val_features_loss: 1177.2509 - val_class_acc: 0.7067 - val_features_acc: 0.5940
Epoch 32/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1463 - class_loss: 2.8496e-04 - features_loss: 14.5966 - class_acc: 1.0000 - features_acc: 0.9587 - val_loss: 3.7972 - val_class_loss: 0.7580 - val_features_loss: 303.9147 - val_class_acc: 0.8200 - val_features_acc: 0.6497
Epoch 33/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1444 - class_loss: 1.8008e-04 - features_loss: 14.4200 - class_acc: 1.0000 - features_acc: 0.9614 - val_loss: 5.9506 - val_class_loss: 0.5128 - val_features_loss: 543.7827 - val_class_acc: 0.9063 - val_features_acc: 0.8470
Epoch 34/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1434 - class_loss: 3.0296e-04 - features_loss: 14.3054 - class_acc: 1.0000 - features_acc: 0.9580 - val_loss: 1.7689 - val_class_loss: 0.0350 - val_features_loss: 173.3881 - val_class_acc: 0.9887 - val_features_acc: 0.8107
Epoch 35/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1450 - class_loss: 4.0905e-04 - features_loss: 14.4622 - class_acc: 0.9999 - features_acc: 0.9587 - val_loss: 2.2573 - val_class_loss: 0.0988 - val_features_loss: 215.8562 - val_class_acc: 0.9527 - val_features_acc: 0.6603
Epoch 36/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1547 - class_loss: 1.4593e-04 - features_loss: 15.4565 - class_acc: 1.0000 - features_acc: 0.9609 - val_loss: 11.8610 - val_class_loss: 2.7836 - val_features_loss: 907.7497 - val_class_acc: 0.7340 - val_features_acc: 0.5947
Epoch 37/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1491 - class_loss: 1.0867e-04 - features_loss: 14.8973 - class_acc: 1.0000 - features_acc: 0.9594 - val_loss: 1.7409 - val_class_loss: 0.0284 - val_features_loss: 171.2533 - val_class_acc: 0.9950 - val_features_acc: 0.9253
Epoch 38/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1432 - class_loss: 9.9859e-05 - features_loss: 14.3100 - class_acc: 1.0000 - features_acc: 0.9664 - val_loss: 12.3351 - val_class_loss: 2.4550 - val_features_loss: 988.0021 - val_class_acc: 0.7413 - val_features_acc: 0.5947
Epoch 39/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1332 - class_loss: 1.2234e-04 - features_loss: 13.3038 - class_acc: 1.0000 - features_acc: 0.9643 - val_loss: 2.9682 - val_class_loss: 0.0783 - val_features_loss: 288.9950 - val_class_acc: 0.9747 - val_features_acc: 0.7110
Epoch 40/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1439 - class_loss: 1.0725e-04 - features_loss: 14.3829 - class_acc: 1.0000 - features_acc: 0.9629 - val_loss: 2.9160 - val_class_loss: 0.1196 - val_features_loss: 279.6394 - val_class_acc: 0.9823 - val_features_acc: 0.8610
Epoch 41/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1358 - class_loss: 2.3824e-04 - features_loss: 13.5561 - class_acc: 1.0000 - features_acc: 0.9589 - val_loss: 9.3048 - val_class_loss: 3.1435 - val_features_loss: 616.1346 - val_class_acc: 0.7087 - val_features_acc: 0.5923
Epoch 42/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1448 - class_loss: 1.9449e-04 - features_loss: 14.4578 - class_acc: 1.0000 - features_acc: 0.9633 - val_loss: 2.5233 - val_class_loss: 0.2807 - val_features_loss: 224.2669 - val_class_acc: 0.9120 - val_features_acc: 0.6727
Epoch 43/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.2086 - class_loss: 0.0068 - features_loss: 20.1849 - class_acc: 0.9983 - features_acc: 0.9496 - val_loss: 19.3632 - val_class_loss: 4.6908 - val_features_loss: 1467.2480 - val_class_acc: 0.7057 - val_features_acc: 0.5920
Epoch 44/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1734 - class_loss: 6.2381e-04 - features_loss: 17.2814 - class_acc: 0.9997 - features_acc: 0.9569 - val_loss: 9.1110 - val_class_loss: 1.9863 - val_features_loss: 712.4671 - val_class_acc: 0.7647 - val_features_acc: 0.6177
Epoch 45/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1612 - class_loss: 5.6875e-04 - features_loss: 16.0647 - class_acc: 0.9999 - features_acc: 0.9591 - val_loss: 21.1116 - val_class_loss: 5.8439 - val_features_loss: 1526.7732 - val_class_acc: 0.4293 - val_features_acc: 0.4563
Epoch 46/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1503 - class_loss: 8.8300e-04 - features_loss: 14.9406 - class_acc: 0.9997 - features_acc: 0.9629 - val_loss: 2.0129 - val_class_loss: 0.0483 - val_features_loss: 196.4606 - val_class_acc: 0.9910 - val_features_acc: 0.9180
Epoch 47/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1959 - class_loss: 0.0076 - features_loss: 18.8292 - class_acc: 0.9980 - features_acc: 0.9523 - val_loss: 22.6670 - val_class_loss: 6.3410 - val_features_loss: 1632.6016 - val_class_acc: 0.4423 - val_features_acc: 0.4300
Epoch 48/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1475 - class_loss: 5.3088e-04 - features_loss: 14.6976 - class_acc: 1.0000 - features_acc: 0.9613 - val_loss: 10.1780 - val_class_loss: 2.7504 - val_features_loss: 742.7626 - val_class_acc: 0.7283 - val_features_acc: 0.5950
Epoch 49/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1364 - class_loss: 1.0950e-04 - features_loss: 13.6242 - class_acc: 1.0000 - features_acc: 0.9659 - val_loss: 6.4206 - val_class_loss: 0.5300 - val_features_loss: 589.0608 - val_class_acc: 0.9050 - val_features_acc: 0.8000
Epoch 50/50
7000/7000 [==============================] - 8s 1ms/step - loss: 0.1275 - class_loss: 1.5437e-04 - features_loss: 12.7307 - class_acc: 1.0000 - features_acc: 0.9666 - val_loss: 8.1595 - val_class_loss: 1.3029 - val_features_loss: 685.6563 - val_class_acc: 0.7920 - val_features_acc: 0.7730

In [330]:
# Plot training & validation accuracy values
plt.plot(history.history['class_acc'])
plt.plot(history.history['features_acc'])
plt.plot(history.history['val_class_acc'])
plt.plot(history.history['val_features_acc'])
plt.title('Model accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epoch')
plt.legend(['class_acc','features_acc','val_class_acc', 'val_features_acc'], loc='lower left')
plt.show()



In [331]:
# Plot training & validation loss values (general loss)
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model loss (General Loss)')
plt.ylabel('Loss')
plt.xlabel('Epoch')
plt.legend(['loss','val_loss'], loc='upper left')
plt.show()



In [332]:
# Plot training & validation loss values (classification loss)
plt.plot(history.history['class_loss'])
plt.plot(history.history['val_class_loss'])
plt.title('Model loss (Classification Loss)')
plt.ylabel('Loss')
plt.xlabel('Epoch')
plt.legend(['class_loss','val_class_loss'], loc='upper left')
plt.show()



In [333]:
# Plot training & validation loss values (features loss)
plt.plot(history.history['features_loss'])
plt.plot(history.history['val_features_loss'])
plt.title('Model loss (Features Loss)')
plt.ylabel('Loss')
plt.xlabel('Epoch')
plt.legend(['features_loss','val_features_loss'], loc='upper left')
plt.show()


Model evaluation


In [334]:
# load best model weights
model.load_weights(save_to)
# evaluate returns: total loss, class loss, features loss, class accuracy, features accuracy
_,_,_,acc_classes,acc_features = model.evaluate(X_test, [Y_test_class,Y_test_features])
print('Test Features Acc:', acc_features)
print('Test Classification Acc:', acc_classes)


1000/1000 [==============================] - 0s 475us/step
Test Features Acc: 0.92
Test Classification Acc: 0.993

In [335]:
# Check errors
features_predicted = model.predict(X_test)[1]
for feat,error in zip(["center_x","center_y","angle","axis_1","axis_2"],
                      (pd.DataFrame(features_predicted) - Y_test_features.values).abs().mean()):
    if feat == "angle":
        continue  # the angle error wraps around, so it is evaluated separately below
    print(feat,"avg error:",error,"pixels")

# check error percentages for angle
predictions = pd.DataFrame(features_predicted)
ellipses_idx = np.where(Y_test_class['class'] == 1)[0]
angle_idx = Y_test_features.columns.get_loc("angle")
true_angles = Y_test_features.loc[ellipses_idx,"angle"]
predicted_angles = predictions.loc[ellipses_idx,predictions.columns[angle_idx]]
err_percentages = (((true_angles - predicted_angles).abs() +180)%360 - 180)/true_angles
print("Avg angle error:", np.mean(err_percentages[err_percentages != np.inf]))


center_x avg error: 1.93428353536129 pixels
center_y avg error: 1.891286836028099 pixels
axis_1 avg error: 1.1033067197799682 pixels
axis_2 avg error: 1.0726100867986679 pixels
Avg angle error: 0.09177505779148392
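The `(x + 180) % 360 - 180` term in the angle error above is the usual wrap-around trick for comparing angles; a minimal standalone version (same expression, hypothetical helper name):

```python
def wrapped_angle_diff(a, b, period=360):
    """Map the raw difference |a - b| into (-period/2, period/2],
    so that e.g. 355 and 5 degrees count as 10 degrees apart."""
    return (abs(a - b) + period / 2) % period - period / 2

print(wrapped_angle_diff(355, 5))  # -> -10.0 (magnitude 10 degrees)
print(wrapped_angle_diff(30, 50))  # -> 20.0
```

Since the ellipse angles were normalized modulo 180 earlier, the same trick with period=180 would arguably be an even tighter match for this dataset.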