Pipelines Test Library Notebook

Imports


In [1]:
# Import pipelines

from binary_nets import binary_nets_wrapper # this is actually pipeline1
from pipeline2 import pipeline2
from pipeline3 import pipeline3
from pipeline4 import pipeline4

from ensemble_builder import ensemble_builder

# Import datasets
import dataset_generator as dataset

from time import time
import os


Using TensorFlow backend.

Data Load


In [2]:
(X_train_ocr, y_train_ocr, X_test_ocr, y_test_ocr, _) = dataset.generate_all_chars_with_class(verbose=0, plot=False)

(X_train_cut, y_train_cut, X_test_cut, y_test_cut) = dataset.generate_dataset_for_segmentator(verbose=0, plot=False)


X_train_char = {}
y_train_char = {}
X_test_char = {}
y_test_char = {}

for char in dataset.ALPHABET_ALL:
    (X_train_char[char], y_train_char[char], X_test_char[char], y_test_char[char]) = dataset.generate_positive_and_negative_labeled(char, verbose=0)
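The loop above builds one binary, one-vs-rest labeled dataset per character. As a rough illustration of that labeling scheme (toy data and a hypothetical `make_binary_labels` helper, not the actual `dataset_generator` API):

```python
import numpy as np

def make_binary_labels(labels, target_char):
    """Hypothetical sketch: relabel a multi-class character set as
    1 (is target_char) vs 0 (anything else), one-vs-rest style."""
    return (np.asarray(labels) == target_char).astype(int)

chars = ["a", "b", "a", "c", "a"]          # toy character labels
y_bin = make_binary_labels(chars, "a")     # -> array([1, 0, 1, 0, 1])
```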

Setup and training of the nets

Binary nets


In [3]:
path_b_nets = "checkpoints/testpip/pipeline1"

# binary nets
binary_nets = {}

for letter in dataset.ALPHABET_ALL:
    # Create the binary net
    letter_path = os.path.join(path_b_nets, letter)
    binary_nets[letter] = ensemble_builder(
        2, 2, number_of_nets=2, path=letter_path,
        nb_filters1=20, nb_filters2=40, dense_layer_size1=150)
    
    # Training
    binary_nets[letter].fit(X_train_char[letter], y_train_char[letter], X_test_char[letter], y_test_char[letter])


Training model 0 ...
Not pre-processing 1 epoch(s)
Training model 1 ...
Not pre-processing 1 epoch(s)
Done.


(identical training output repeats for each of the remaining letters in dataset.ALPHABET_ALL)
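`ensemble_builder` trains `number_of_nets` models per class. Its internals aren't shown here, but a common way such an ensemble combines its members is by averaging their predicted class probabilities; a minimal, self-contained sketch of that idea (toy "models" as plain callables, not the real API):

```python
import numpy as np

def average_ensemble(models, x):
    """Average the class-probability outputs of several models and
    return the index of the most probable class."""
    probs = np.mean([m(x) for m in models], axis=0)
    return int(np.argmax(probs))

# Two toy "nets" that disagree on class confidence:
net0 = lambda x: np.array([0.6, 0.4])
net1 = lambda x: np.array([0.2, 0.8])

average_ensemble([net0, net1], None)  # mean probs [0.4, 0.6] -> class 1
```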


Cut classifier


In [4]:
path_cut_classifier = "checkpoints/testpip/cut_classifier"

cut_class = ensemble_builder(
    2, 2, number_of_nets=2, path=path_cut_classifier,
    nb_filters1=50, nb_filters2=100, dense_layer_size1=250)

cut_class.fit(X_train_cut, y_train_cut, X_test_cut, y_test_cut)


Training model 0 ...
Not pre-processing 1 epoch(s)
Training model 1 ...
Not pre-processing 1 epoch(s)
Done.


OCR classifier


In [5]:
path_ocr_class = "checkpoints/testpip/ocr_classifier"

ocr_classifier = ensemble_builder(
    21 + 1, 2, number_of_nets=2, path=path_ocr_class,
    nb_filters1=50, nb_filters2=100, dense_layer_size1=250)

ocr_classifier.fit(X_train_ocr, y_train_ocr, X_test_ocr, y_test_ocr)


Training model 0 ...
Not pre-processing 1 epoch(s)
Training model 1 ...
Not pre-processing 1 epoch(s)
Done.


Pipelines

Pipeline 1


In [6]:
pip1 = binary_nets_wrapper(binary_nets)

prediction_pip1 = pip1.predict(X_test_ocr)
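`binary_nets_wrapper` combines the per-letter binary nets into a single classifier. Its implementation isn't shown, but one plausible scheme is to run every binary net on the input and pick the letter whose net reports the highest positive score. A toy sketch of that idea (a dict of callables standing in for the real nets):

```python
def predict_with_binary_nets(nets, x):
    """Pick the key whose binary 'net' gives the highest positive score.
    Each net here is a callable returning P(x is that character)."""
    return max(nets, key=lambda ch: nets[ch](x))

toy_nets = {"a": lambda x: 0.1, "b": lambda x: 0.9, "c": lambda x: 0.3}
predict_with_binary_nets(toy_nets, None)  # -> "b"
```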

Pipeline 2


In [7]:
pip2 = pipeline2(cut_class, ocr_classifier)

prediction_pip2 = pip2.predict(X_test_ocr)

Pipeline 3


In [8]:
pip3 = pipeline3(pip1, ocr_classifier)

prediction_pip3 = pip3.predict(X_test_ocr)

Pipeline 4


In [9]:
pip4 = pipeline4(cut_class, pip1)

prediction_pip4 = pip4.predict(X_test_ocr)
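The four prediction arrays can then be scored against `y_test_ocr` to compare the pipelines. A minimal accuracy sketch with toy arrays (the real label format may differ, e.g. one-hot vectors vs class indices):

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of positions where predicted labels match the truth."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))

accuracy([0, 1, 2, 2], [0, 1, 1, 2])  # -> 0.75
```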