Experiments with Similarity Encoders

...to show that SimEc can create similarity preserving embeddings based on human ratings

In this iPython Notebook are some examples to illustrate the potential of Similarity Encoders (SimEc) for creating similarity preserving embeddings. For further details and theoretical background on this new neural network architecture, please refer to the corresponding paper.


In [1]:
from __future__ import unicode_literals, division, print_function, absolute_import
from builtins import range
import numpy as np
np.random.seed(28)
import matplotlib.pyplot as plt
from sklearn.linear_model import Ridge
from sklearn.decomposition import PCA, KernelPCA
from sklearn.datasets import load_digits, fetch_mldata, fetch_20newsgroups
from sklearn.neighbors import KNeighborsClassifier as KNN
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV
import tensorflow as tf
tf.set_random_seed(28)
import keras

# find nlputils at https://github.com/cod3licious/nlputils
from nlputils.features import FeatureTransform, features2mat

from simec import SimilarityEncoder
from utils import center_K, check_similarity_match
from utils_plotting import get_colors, plot_digits, plot_mnist, plot_20news

%matplotlib inline
%load_ext autoreload
%autoreload 2
# set this to True if you want to save the figures from the paper
savefigs = False


Using TensorFlow backend.

Handwritten Digits (8x8 px)

See http://scikit-learn.org/stable/auto_examples/datasets/plot_digits_last_image.html


In [2]:
# load digits dataset
digits = load_digits()
X = digits.data
X /= float(X.max())
ss = StandardScaler(with_std=False)
X = ss.fit_transform(X)
y = digits.target
n_samples, n_features = X.shape

SimEc based on class labels

We've seen that SimEcs can reach the same solutions as traditional spectral methods such as kPCA and isomap. However, these methods have the limitation that you can only embed new data points if you can compute their kernel map, i.e. the similarity to the training examples. But what if the similarity matrix used as targets during training was generated by an unknown process such as human similarity judgments?

To show how we can use SimEc in such a scenario, we construct the similarity matrix from the class labels assigned by human annotators (1=same class, 0=different class).


In [3]:
Y = np.tile(y, (len(y), 1))
S = center_K(np.array(Y==Y.T, dtype=int))
# take only some of the samples as targets to speed it all up
n_targets = 1000

In [4]:
# knn accuracy using all original feature dimensions
clf = KNN(n_neighbors=10)
clf.fit(X[:n_targets], y[:n_targets])
print("knn accuracy: %f" % clf.score(X[n_targets:], y[n_targets:]))


knn accuracy: 0.956085

In [5]:
# PCA
pca = PCA(n_components=2)
X_embedp = pca.fit_transform(X)
plot_digits(X_embedp, digits, title='Digits embedded with PCA')
clf = KNN(n_neighbors=10)
clf.fit(X_embedp[:n_targets], y[:n_targets])
print("knn accuracy: %f" % clf.score(X_embedp[n_targets:], y[n_targets:]))


knn accuracy: 0.608532

In [6]:
# check how many relevant dimensions there are
eigenvals = np.linalg.eigvalsh(S)[::-1]
plt.figure();
plt.plot(list(range(1, S.shape[0]+1)), eigenvals, '-o', markersize=3);
plt.plot([1, S.shape[0]],[0,0], 'k--', linewidth=0.5);
plt.xlim(1, X.shape[1]+1);
plt.title('Eigenvalue spectrum of S (based on class labels)');



In [7]:
D, V = np.linalg.eig(S)
# regular kpca embedding: take largest EV
D1, V1 = D[np.argsort(D)[::-1]], V[:,np.argsort(D)[::-1]]
X_embed = np.dot(V1.real, np.diag(np.sqrt(np.abs(D1.real))))
plot_digits(X_embed[:,:2], digits, title='Digits embedded based on first 2 components', plot_box=False)
clf = KNN(n_neighbors=10)
clf.fit(X_embed[:n_targets,:2], y[:n_targets])
print("knn accuracy: %f" % clf.score(X_embed[n_targets:,:2], y[n_targets:]))
print("similarity approximation - mse: %f" % check_similarity_match(X_embed[:,:2], S)[0])


knn accuracy: 1.000000
similarity approximation - mse: 0.069408

Lets first try a simple linear SimEc.


In [8]:
# similarity encoder with similarities relying on class information - linear
simec = SimilarityEncoder(X.shape[1], 2, n_targets, l2_reg_emb=0.00001, l2_reg_out=0.0000001, 
                          s_ll_reg=0.5, S_ll=S[:n_targets,:n_targets], opt=keras.optimizers.Adamax(lr=0.005))
simec.fit(X, S[:,:n_targets])
X_embed = simec.transform(X)
plot_digits(X_embed, digits, title='Digits - SimEc (class sim, linear)')
# of course we're overfitting here quite a bit since we used all samples for training
# even if we didn't use the corresponding similarities...but this is only a toy example anyways
clf = KNN(n_neighbors=10)
clf.fit(X_embed[:n_targets], y[:n_targets])
print("knn accuracy: %f" % clf.score(X_embed[n_targets:], y[n_targets:]))
print("similarity approximation - mse: %f" % check_similarity_match(X_embed, S)[0])


Epoch 1/25
1797/1797 [==============================] - 1s 386us/step - loss: 0.1301
Epoch 2/25
1797/1797 [==============================] - 0s 92us/step - loss: 0.1203
Epoch 3/25
1797/1797 [==============================] - 0s 99us/step - loss: 0.1175
Epoch 4/25
1797/1797 [==============================] - 0s 84us/step - loss: 0.1163
Epoch 5/25
1797/1797 [==============================] - 0s 84us/step - loss: 0.1151
Epoch 6/25
1797/1797 [==============================] - 0s 85us/step - loss: 0.1141
Epoch 7/25
1797/1797 [==============================] - 0s 91us/step - loss: 0.1129
Epoch 8/25
1797/1797 [==============================] - 0s 89us/step - loss: 0.1118
Epoch 9/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1109
Epoch 10/25
1797/1797 [==============================] - 0s 93us/step - loss: 0.1102
Epoch 11/25
1797/1797 [==============================] - 0s 95us/step - loss: 0.1097
Epoch 12/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1094
Epoch 13/25
1797/1797 [==============================] - 0s 94us/step - loss: 0.1092
Epoch 14/25
1797/1797 [==============================] - 0s 95us/step - loss: 0.1091
Epoch 15/25
1797/1797 [==============================] - 0s 97us/step - loss: 0.1090
Epoch 16/25
1797/1797 [==============================] - 0s 95us/step - loss: 0.1089
Epoch 17/25
1797/1797 [==============================] - 0s 94us/step - loss: 0.1089
Epoch 18/25
1797/1797 [==============================] - 0s 90us/step - loss: 0.1088
Epoch 19/25
1797/1797 [==============================] - 0s 85us/step - loss: 0.1088
Epoch 20/25
1797/1797 [==============================] - 0s 92us/step - loss: 0.1088
Epoch 21/25
1797/1797 [==============================] - 0s 88us/step - loss: 0.1087
Epoch 22/25
1797/1797 [==============================] - 0s 88us/step - loss: 0.1087
Epoch 23/25
1797/1797 [==============================] - 0s 85us/step - loss: 0.1087
Epoch 24/25
1797/1797 [==============================] - 0s 95us/step - loss: 0.1087
Epoch 25/25
1797/1797 [==============================] - 0s 93us/step - loss: 0.1086
knn accuracy: 0.698871
similarity approximation - mse: 0.076450

Great, we already see some clusters separating from the rest! What if we add more layers?

We can examine how the embedding changes during training: first some clusters separate, then it starts to look like the eigenvalue based embedding with the clusters of several numbers pulled together.


In [9]:
# similarity encoder with similarities relying on class information - 1 hidden layer
n_targets = 1000
simec = SimilarityEncoder(X.shape[1], 2, n_targets, hidden_layers=[(100, 'tanh')], 
                          l2_reg=0.00000001, l2_reg_emb=0.00001, l2_reg_out=0.0000001, 
                          s_ll_reg=0.5, S_ll=S[:n_targets,:n_targets], opt=keras.optimizers.Adamax(lr=0.01))
e_total = 0
for e in [5, 10, 10, 10, 15, 25, 25]:
    e_total += e
    print(e_total)
    simec.fit(X, S[:,:n_targets], epochs=e)
    X_embed = simec.transform(X)
    clf = KNN(n_neighbors=10)
    clf.fit(X_embed[:1000], y[:1000])
    acc = clf.score(X_embed[1000:], y[1000:])
    print("knn accuracy: %f" % acc)
    print("similarity approximation - mse: %f" % check_similarity_match(X_embed, S)[0])
    plot_digits(X_embed, digits, title='SimEc after %i epochs; accuracy: %.1f' % (e_total, 100*acc) , plot_box=False)


5
Epoch 1/5
1797/1797 [==============================] - 0s 185us/step - loss: 0.1232
Epoch 2/5
1797/1797 [==============================] - 0s 113us/step - loss: 0.1172
Epoch 3/5
1797/1797 [==============================] - 0s 114us/step - loss: 0.1161
Epoch 4/5
1797/1797 [==============================] - 0s 111us/step - loss: 0.1147
Epoch 5/5
1797/1797 [==============================] - 0s 102us/step - loss: 0.1129
knn accuracy: 0.670013
similarity approximation - mse: 0.114144
15
Epoch 1/10
1797/1797 [==============================] - 0s 116us/step - loss: 0.1109
Epoch 2/10
1797/1797 [==============================] - 0s 111us/step - loss: 0.1092
Epoch 3/10
1797/1797 [==============================] - 0s 102us/step - loss: 0.1081
Epoch 4/10
1797/1797 [==============================] - 0s 102us/step - loss: 0.1076
Epoch 5/10
1797/1797 [==============================] - 0s 104us/step - loss: 0.1073
Epoch 6/10
1797/1797 [==============================] - 0s 107us/step - loss: 0.1071
Epoch 7/10
1797/1797 [==============================] - 0s 113us/step - loss: 0.1069
Epoch 8/10
1797/1797 [==============================] - 0s 107us/step - loss: 0.1068
Epoch 9/10
1797/1797 [==============================] - 0s 120us/step - loss: 0.1067
Epoch 10/10
1797/1797 [==============================] - 0s 122us/step - loss: 0.1065
knn accuracy: 0.818068
similarity approximation - mse: 0.072731
25
Epoch 1/10
1797/1797 [==============================] - 0s 107us/step - loss: 0.1064
Epoch 2/10
1797/1797 [==============================] - 0s 103us/step - loss: 0.1064
Epoch 3/10
1797/1797 [==============================] - 0s 109us/step - loss: 0.1063
Epoch 4/10
1797/1797 [==============================] - 0s 109us/step - loss: 0.1062
Epoch 5/10
1797/1797 [==============================] - 0s 102us/step - loss: 0.1062
Epoch 6/10
1797/1797 [==============================] - 0s 99us/step - loss: 0.1061
Epoch 7/10
1797/1797 [==============================] - 0s 102us/step - loss: 0.1061
Epoch 8/10
1797/1797 [==============================] - 0s 101us/step - loss: 0.1060
Epoch 9/10
1797/1797 [==============================] - 0s 99us/step - loss: 0.1060
Epoch 10/10
1797/1797 [==============================] - 0s 98us/step - loss: 0.1060
knn accuracy: 0.902133
similarity approximation - mse: 0.071673
35
Epoch 1/10
1797/1797 [==============================] - 0s 106us/step - loss: 0.1059
Epoch 2/10
1797/1797 [==============================] - 0s 107us/step - loss: 0.1058
Epoch 3/10
1797/1797 [==============================] - 0s 111us/step - loss: 0.1058
Epoch 4/10
1797/1797 [==============================] - 0s 107us/step - loss: 0.1058
Epoch 5/10
1797/1797 [==============================] - 0s 109us/step - loss: 0.1058
Epoch 6/10
1797/1797 [==============================] - 0s 108us/step - loss: 0.1057
Epoch 7/10
1797/1797 [==============================] - 0s 106us/step - loss: 0.1056
Epoch 8/10
1797/1797 [==============================] - 0s 108us/step - loss: 0.1057
Epoch 9/10
1797/1797 [==============================] - 0s 105us/step - loss: 0.1056
Epoch 10/10
1797/1797 [==============================] - 0s 104us/step - loss: 0.1057
knn accuracy: 0.953576
similarity approximation - mse: 0.071307
50
Epoch 1/15
1797/1797 [==============================] - 0s 106us/step - loss: 0.1055
Epoch 2/15
1797/1797 [==============================] - 0s 108us/step - loss: 0.1055
Epoch 3/15
1797/1797 [==============================] - 0s 109us/step - loss: 0.1055
Epoch 4/15
1797/1797 [==============================] - 0s 101us/step - loss: 0.1054
Epoch 5/15
1797/1797 [==============================] - 0s 98us/step - loss: 0.1055
Epoch 6/15
1797/1797 [==============================] - 0s 94us/step - loss: 0.1054
Epoch 7/15
1797/1797 [==============================] - 0s 110us/step - loss: 0.1054
Epoch 8/15
1797/1797 [==============================] - 0s 101us/step - loss: 0.1053
Epoch 9/15
1797/1797 [==============================] - 0s 110us/step - loss: 0.1053
Epoch 10/15
1797/1797 [==============================] - 0s 105us/step - loss: 0.1053
Epoch 11/15
1797/1797 [==============================] - 0s 105us/step - loss: 0.1053
Epoch 12/15
1797/1797 [==============================] - 0s 103us/step - loss: 0.1053
Epoch 13/15
1797/1797 [==============================] - 0s 98us/step - loss: 0.1052
Epoch 14/15
1797/1797 [==============================] - 0s 97us/step - loss: 0.1052
Epoch 15/15
1797/1797 [==============================] - 0s 110us/step - loss: 0.1052
knn accuracy: 0.933501
similarity approximation - mse: 0.071060
75
Epoch 1/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1052
Epoch 2/25
1797/1797 [==============================] - 0s 116us/step - loss: 0.1052
Epoch 3/25
1797/1797 [==============================] - 0s 115us/step - loss: 0.1052
Epoch 4/25
1797/1797 [==============================] - 0s 120us/step - loss: 0.1052
Epoch 5/25
1797/1797 [==============================] - 0s 107us/step - loss: 0.1051
Epoch 6/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1051
Epoch 7/25
1797/1797 [==============================] - 0s 97us/step - loss: 0.1051
Epoch 8/25
1797/1797 [==============================] - 0s 98us/step - loss: 0.1051
Epoch 9/25
1797/1797 [==============================] - 0s 104us/step - loss: 0.1051
Epoch 10/25
1797/1797 [==============================] - 0s 107us/step - loss: 0.1051
Epoch 11/25
1797/1797 [==============================] - 0s 106us/step - loss: 0.1050
Epoch 12/25
1797/1797 [==============================] - 0s 87us/step - loss: 0.1050
Epoch 13/25
1797/1797 [==============================] - 0s 101us/step - loss: 0.1050
Epoch 14/25
1797/1797 [==============================] - 0s 116us/step - loss: 0.1050
Epoch 15/25
1797/1797 [==============================] - 0s 105us/step - loss: 0.1050
Epoch 16/25
1797/1797 [==============================] - 0s 103us/step - loss: 0.1050
Epoch 17/25
1797/1797 [==============================] - 0s 99us/step - loss: 0.1049
Epoch 18/25
1797/1797 [==============================] - 0s 102us/step - loss: 0.1049
Epoch 19/25
1797/1797 [==============================] - 0s 92us/step - loss: 0.1050
Epoch 20/25
1797/1797 [==============================] - 0s 97us/step - loss: 0.1049
Epoch 21/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1049
Epoch 22/25
1797/1797 [==============================] - 0s 97us/step - loss: 0.1049
Epoch 23/25
1797/1797 [==============================] - 0s 97us/step - loss: 0.1048
Epoch 24/25
1797/1797 [==============================] - 0s 104us/step - loss: 0.1049
Epoch 25/25
1797/1797 [==============================] - 0s 108us/step - loss: 0.1049
knn accuracy: 0.924718
similarity approximation - mse: 0.070712
100
Epoch 1/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1048
Epoch 2/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1048
Epoch 3/25
1797/1797 [==============================] - 0s 115us/step - loss: 0.1048
Epoch 4/25
1797/1797 [==============================] - 0s 117us/step - loss: 0.1047
Epoch 5/25
1797/1797 [==============================] - 0s 106us/step - loss: 0.1047
Epoch 6/25
1797/1797 [==============================] - 0s 97us/step - loss: 0.1047
Epoch 7/25
1797/1797 [==============================] - 0s 96us/step - loss: 0.1047
Epoch 8/25
1797/1797 [==============================] - 0s 99us/step - loss: 0.1047
Epoch 9/25
1797/1797 [==============================] - 0s 100us/step - loss: 0.1047
Epoch 10/25
1797/1797 [==============================] - 0s 96us/step - loss: 0.1047
Epoch 11/25
1797/1797 [==============================] - 0s 92us/step - loss: 0.1047
Epoch 12/25
1797/1797 [==============================] - 0s 96us/step - loss: 0.1047
Epoch 13/25
1797/1797 [==============================] - 0s 99us/step - loss: 0.1047
Epoch 14/25
1797/1797 [==============================] - 0s 106us/step - loss: 0.1046
Epoch 15/25
1797/1797 [==============================] - 0s 109us/step - loss: 0.1046
Epoch 16/25
1797/1797 [==============================] - 0s 106us/step - loss: 0.1046
Epoch 17/25
1797/1797 [==============================] - 0s 114us/step - loss: 0.1047
Epoch 18/25
1797/1797 [==============================] - 0s 114us/step - loss: 0.1046
Epoch 19/25
1797/1797 [==============================] - 0s 107us/step - loss: 0.1046
Epoch 20/25
1797/1797 [==============================] - 0s 111us/step - loss: 0.1046
Epoch 21/25
1797/1797 [==============================] - 0s 107us/step - loss: 0.1046
Epoch 22/25
1797/1797 [==============================] - 0s 106us/step - loss: 0.1046
Epoch 23/25
1797/1797 [==============================] - 0s 109us/step - loss: 0.1046
Epoch 24/25
1797/1797 [==============================] - 0s 104us/step - loss: 0.1046
Epoch 25/25
1797/1797 [==============================] - 0s 103us/step - loss: 0.1045
knn accuracy: 0.853199
similarity approximation - mse: 0.070510

MNIST Dataset

Embedding the regular 28x28 pixel MNIST digits


In [10]:
# load digits
mnist = fetch_mldata('MNIST original', data_home='data')
X = mnist.data/255.  # normalize to 0-1
y = np.array(mnist.target, dtype=int)
# subsample 10000 random data points
np.random.seed(42)
n_samples = 10000
n_test = 2000
n_targets = 1000
rnd_idx = np.random.permutation(X.shape[0])[:n_samples]
X_test, y_test = X[rnd_idx[:n_test],:], y[rnd_idx[:n_test]]
X, y = X[rnd_idx[n_test:],:], y[rnd_idx[n_test:]]
# scale
ss = StandardScaler(with_std=False)
X = ss.fit_transform(X)
X_test = ss.transform(X_test)
n_train, n_features = X.shape

In [11]:
# compute similarity matrix based on class labels
Y = np.tile(y, (len(y), 1))
S = center_K(np.array(Y==Y.T, dtype=int))
Y = np.tile(y_test, (len(y_test), 1))
S_test = center_K(np.array(Y==Y.T, dtype=int))

"Kernel PCA" and Ridge Regression vs. SimEc

To get an idea of how a perfect similarity preserving embedding would look like when computing similarities from class labels, we can embed the data by performing an eigendecomposition of the similarity matrix (i.e. performing kernel PCA). However, since in a real setting we would be unable to compute the similarities of the test samples to the training samples (since we don't know their class labels), to map the test samples into the embedding space we additionally need to train a (ridge) regression model to map from the original input space to the embedding space.

A SimEc with multiple hidden layers starts to get close to the eigendecomposition solution.


In [12]:
D, V = np.linalg.eig(S)
# as a comparison: regular kpca embedding: take largest EV
D1, V1 = D[np.argsort(D)[::-1]], V[:,np.argsort(D)[::-1]]
X_embed = np.dot(V1.real, np.diag(np.sqrt(np.abs(D1.real))))
plot_mnist(X_embed[:,:2], y, title='MNIST (train) - largest 2 EV')
print("similarity approximation  2D - mse: %f" % check_similarity_match(X_embed[:,:2], S)[0])
print("similarity approximation  5D - mse: %f" % check_similarity_match(X_embed[:,:5], S)[0])
print("similarity approximation  7D - mse: %f" % check_similarity_match(X_embed[:,:7], S)[0])
print("similarity approximation 10D - mse: %f" % check_similarity_match(X_embed[:,:10], S)[0])
print("similarity approximation 25D - mse: %f" % check_similarity_match(X_embed[:,:25], S)[0])


similarity approximation  2D - mse: 0.066420
similarity approximation  5D - mse: 0.035617
similarity approximation  7D - mse: 0.016659
similarity approximation 10D - mse: 0.000000
similarity approximation 25D - mse: 0.000000

In [13]:
n_targets = 2000
# get good alpha for RR model
m = Ridge()
rrm = GridSearchCV(m, {'alpha': [0.000001, 0.00001, 0.0001, 0.001, 0.01, 0.1, 0.25, 0.5, 0.75, 1., 2.5, 5., 7.5, 10., 25., 50., 75., 100., 250., 500., 750., 1000.]})
rrm.fit(X, X_embed[:,:8])
alpha = rrm.best_params_["alpha"]
print("Ridge Regression with alpha: %r" % alpha)
mse_ev, mse_rr, mse_rr_test = [], [], []
mse_simec, mse_simec_test = [], []
mse_simec_hl, mse_simec_hl_test = [], []
e_dims = [2, 3, 4, 5, 6, 7, 8, 9, 10, 15]
for e_dim in e_dims:
    print(e_dim)
    # eigenvalue based embedding
    mse = check_similarity_match(X_embed[:,:e_dim], S)[0]
    mse_ev.append(mse)
    # train a linear ridge regression model to learn the mapping from X to Y
    model = Ridge(alpha=alpha)
    model.fit(X, X_embed[:,:e_dim])
    X_embed_r = model.predict(X)
    X_embed_test_r = model.predict(X_test)
    mse = check_similarity_match(X_embed_r, S)[0]
    mse_rr.append(mse)
    mse = check_similarity_match(X_embed_test_r, S_test)[0]
    mse_rr_test.append(mse)
    # simec - linear
    simec = SimilarityEncoder(X.shape[1], e_dim, n_targets, s_ll_reg=0.5, S_ll=S[:n_targets,:n_targets],
                              orth_reg=0.001 if e_dim > 8 else 0., l2_reg_emb=0.00001, 
                              l2_reg_out=0.0000001, opt=keras.optimizers.Adamax(lr=0.001))
    simec.fit(X, S[:,:n_targets])
    X_embeds = simec.transform(X)
    X_embed_tests = simec.transform(X_test)
    mse = check_similarity_match(X_embeds, S)[0]
    mse_simec.append(mse)
    mse_t = check_similarity_match(X_embed_tests, S_test)[0]
    mse_simec_test.append(mse_t)
    # simec - 2hl
    simec = SimilarityEncoder(X.shape[1], e_dim, n_targets, hidden_layers=[(25, 'tanh'), (25, 'tanh')],
                              s_ll_reg=0.5, S_ll=S[:n_targets,:n_targets], orth_reg=0.001 if e_dim > 7 else 0., 
                              l2_reg=0., l2_reg_emb=0.00001, l2_reg_out=0.0000001, opt=keras.optimizers.Adamax(lr=0.001))
    simec.fit(X, S[:,:n_targets])
    X_embeds = simec.transform(X)
    X_embed_tests = simec.transform(X_test)
    mse = check_similarity_match(X_embeds, S)[0]
    mse_simec_hl.append(mse)
    mse_t = check_similarity_match(X_embed_tests, S_test)[0]
    mse_simec_hl_test.append(mse_t)
    print("mse ev: %f; mse rr: %f (%f); mse simec (0hl): %f (%f); mse simec (2hl): %f (%f)" % (mse_ev[-1], mse_rr[-1], mse_rr_test[-1], mse_simec[-1], mse_simec_test[-1], mse, mse_t))
keras.backend.clear_session()
colors = get_colors(15)
plt.figure();
plt.plot(e_dims, mse_ev, '-o', markersize=3, c=colors[14], label='Eigendecomposition');
plt.plot(e_dims, mse_rr, '-o', markersize=3, c=colors[12], label='ED + Regression');
plt.plot(e_dims, mse_rr_test, '--o', markersize=3, c=colors[12], label='ED + Regression (test)');
plt.plot(e_dims, mse_simec, '-o', markersize=3, c=colors[8], label='SimEc 0hl');
plt.plot(e_dims, mse_simec_test, '--o', markersize=3, c=colors[8], label='SimEc 0hl (test)');
plt.plot(e_dims, mse_simec_hl, '-o', markersize=3, c=colors[4], label='SimEc 2hl');
plt.plot(e_dims, mse_simec_hl_test, '--o', markersize=3, c=colors[4], label='SimEc 2hl (test)');
plt.legend(loc=0);
plt.title('MNIST (class based similarities)');
plt.plot([0, e_dims[-1]], [0,0], 'k--', linewidth=0.5);
plt.xticks(e_dims, e_dims);
plt.xlabel('Number of Embedding Dimensions ($d$)')
plt.ylabel('Mean Squared Error of $\hat{S}$')
print("e_dims=", e_dims)
print("mse_ev=", mse_ev)
print("mse_rr=", mse_rr)
print("mse_rr_test=", mse_rr_test)
print("mse_simec=", mse_simec)
print("mse_simec_test=", mse_simec_test)
print("mse_simec_hl=", mse_simec_hl)
print("mse_simec_hl_test=", mse_simec_hl_test)
if savefigs: plt.savefig('fig_class_mse_edim.pdf', dpi=300)


Ridge Regression with alpha: 75.0
2
Epoch 1/25
8000/8000 [==============================] - 1s 176us/step - loss: 0.1282
Epoch 2/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1199
Epoch 3/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.1188
Epoch 4/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1182
Epoch 5/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.1177
Epoch 6/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1173
Epoch 7/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1168
Epoch 8/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1163
Epoch 9/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1157
Epoch 10/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1150
Epoch 11/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.1140
Epoch 12/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.1128
Epoch 13/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.1113
Epoch 14/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1096
Epoch 15/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1081
Epoch 16/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1070
Epoch 17/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1064
Epoch 18/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.1061
Epoch 19/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1059
Epoch 20/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.1059
Epoch 21/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1058
Epoch 22/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.1057
Epoch 23/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.1057
Epoch 24/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.1056
Epoch 25/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.1056
Epoch 1/25
8000/8000 [==============================] - 2s 208us/step - loss: 0.1273
Epoch 2/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1186
Epoch 3/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1171
Epoch 4/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1159
Epoch 5/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1148
Epoch 6/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1139
Epoch 7/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1131
Epoch 8/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1122
Epoch 9/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1109
Epoch 10/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1093
Epoch 11/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1075
Epoch 12/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.1056
Epoch 13/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.1040
Epoch 14/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.1028
Epoch 15/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1019
Epoch 16/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1014
Epoch 17/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1013
Epoch 18/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.1012
Epoch 19/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1011
Epoch 20/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.1010
Epoch 21/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1009
Epoch 22/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.1009
Epoch 23/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1008
Epoch 24/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.1008
Epoch 25/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.1007
mse ev: 0.066420; mse rr: 0.077786 (0.079355); mse simec (0hl): 0.076568 (0.078448); mse simec (2hl): 0.069105 (0.072084)
3
Epoch 1/25
8000/8000 [==============================] - 1s 177us/step - loss: 0.1236
Epoch 2/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.1112
Epoch 3/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1098
Epoch 4/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.1090
Epoch 5/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1083
Epoch 6/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1075
Epoch 7/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1064
Epoch 8/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1049
Epoch 9/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1027
Epoch 10/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.1001
Epoch 11/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0972
Epoch 12/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0948
Epoch 13/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0934
Epoch 14/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0928
Epoch 15/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0925
Epoch 16/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0923
Epoch 17/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0923
Epoch 18/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0922
Epoch 19/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0921
Epoch 20/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0921
Epoch 21/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0920
Epoch 22/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0920
Epoch 23/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0920
Epoch 24/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0919
Epoch 25/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0919
Epoch 1/25
8000/8000 [==============================] - 2s 207us/step - loss: 0.1223
Epoch 2/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1107
Epoch 3/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.1087
Epoch 4/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1075
Epoch 5/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1065
Epoch 6/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1054
Epoch 7/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1042
Epoch 8/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1028
Epoch 9/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.1009
Epoch 10/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0986
Epoch 11/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0962
Epoch 12/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0942
Epoch 13/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0927
Epoch 14/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0914
Epoch 15/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0902
Epoch 16/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0890
Epoch 17/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0879
Epoch 18/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0873
Epoch 19/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0870
Epoch 20/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0867
Epoch 21/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0865
Epoch 22/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0863
Epoch 23/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0861
Epoch 24/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0860
Epoch 25/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0858
mse ev: 0.055818; mse rr: 0.072826 (0.075059); mse simec (0hl): 0.071039 (0.074592); mse simec (2hl): 0.061360 (0.066978)
4
Epoch 1/25
8000/8000 [==============================] - 1s 179us/step - loss: 0.1224
Epoch 2/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1058
Epoch 3/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.1033
Epoch 4/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.1020
Epoch 5/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.1009
Epoch 6/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0996
Epoch 7/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0979
Epoch 8/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0957
Epoch 9/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0928
Epoch 10/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0899
Epoch 11/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0873
Epoch 12/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0850
Epoch 13/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0830
Epoch 14/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0816
Epoch 15/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0809
Epoch 16/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0806
Epoch 17/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0804
Epoch 18/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0803
Epoch 19/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0802
Epoch 20/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0801
Epoch 21/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0801
Epoch 22/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0800
Epoch 23/25
8000/8000 [==============================] - ETA: 0s - loss: 0.079 - 1s 144us/step - loss: 0.0799
Epoch 24/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0799
Epoch 25/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0799
Epoch 1/25
8000/8000 [==============================] - 2s 222us/step - loss: 0.1192
Epoch 2/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.1032
Epoch 3/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.1002
Epoch 4/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0984
Epoch 5/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0968
Epoch 6/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0951
Epoch 7/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0931
Epoch 8/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0903
Epoch 9/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0869
Epoch 10/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0832
Epoch 11/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0797
Epoch 12/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0768
Epoch 13/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0746
Epoch 14/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0731
Epoch 15/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0723
Epoch 16/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0718
Epoch 17/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0715
Epoch 18/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0713
Epoch 19/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0711
Epoch 20/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0709
Epoch 21/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0707
Epoch 22/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0706
Epoch 23/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0704
Epoch 24/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0703
Epoch 25/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0702
mse ev: 0.045566; mse rr: 0.068304 (0.071994); mse simec (0hl): 0.067128 (0.071838); mse simec (2hl): 0.052726 (0.060868)
5
Epoch 1/25
8000/8000 [==============================] - 1s 183us/step - loss: 0.1216
Epoch 2/25
8000/8000 [==============================] - ETA: 0s - loss: 0.102 - 1s 141us/step - loss: 0.1025
Epoch 3/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0984
Epoch 4/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0967
Epoch 5/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0951
Epoch 6/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0935
Epoch 7/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0913
Epoch 8/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0884
Epoch 9/25
8000/8000 [==============================] - 1s 137us/step - loss: 0.0847
Epoch 10/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0806
Epoch 11/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0767
Epoch 12/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0736
Epoch 13/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0715
Epoch 14/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0704
Epoch 15/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0699
Epoch 16/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0696
Epoch 17/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0694
Epoch 18/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0693
Epoch 19/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0691
Epoch 20/25
8000/8000 [==============================] - 1s 137us/step - loss: 0.0690
Epoch 21/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0689
Epoch 22/25
8000/8000 [==============================] - 1s 137us/step - loss: 0.0689
Epoch 23/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0688
Epoch 24/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0687
Epoch 25/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0687
Epoch 1/25
8000/8000 [==============================] - 2s 215us/step - loss: 0.1198
Epoch 2/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.1027
Epoch 3/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0983
Epoch 4/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0934
Epoch 5/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0902
Epoch 6/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0873
Epoch 7/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0843
Epoch 8/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0808
Epoch 9/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0769
Epoch 10/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0730
Epoch 11/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0696
Epoch 12/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0669
Epoch 13/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0654
Epoch 14/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0646
Epoch 15/25
8000/8000 [==============================] - 1s 164us/step - loss: 0.0639
Epoch 16/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0631
Epoch 17/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0622
Epoch 18/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0611
Epoch 19/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0599
Epoch 20/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0590
Epoch 21/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0583
Epoch 22/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0579
Epoch 23/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0576
Epoch 24/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0574
Epoch 25/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0572
mse ev: 0.035617; mse rr: 0.064545 (0.069153); mse simec (0hl): 0.063245 (0.068595); mse simec (2hl): 0.045876 (0.054888)
6
Epoch 1/25
8000/8000 [==============================] - 2s 191us/step - loss: 0.1196
Epoch 2/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0986
Epoch 3/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0928
Epoch 4/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0907
Epoch 5/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0889
Epoch 6/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0868
Epoch 7/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0841
Epoch 8/25
8000/8000 [==============================] - 1s 136us/step - loss: 0.0805
Epoch 9/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0760
Epoch 10/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0714
Epoch 11/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0678
Epoch 12/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0654
Epoch 13/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0639
Epoch 14/25
8000/8000 [==============================] - 1s 137us/step - loss: 0.0627
Epoch 15/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0617
Epoch 16/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0609
Epoch 17/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0604
Epoch 18/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0602
Epoch 19/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0600
Epoch 20/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0598
Epoch 21/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0598
Epoch 22/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0596
Epoch 23/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0596
Epoch 24/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0595
Epoch 25/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0594
Epoch 1/25
8000/8000 [==============================] - 2s 221us/step - loss: 0.1161
Epoch 2/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0954
Epoch 3/25
8000/8000 [==============================] - 1s 174us/step - loss: 0.0908
Epoch 4/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0858
Epoch 5/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0822
Epoch 6/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0789
Epoch 7/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0750
Epoch 8/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0704
Epoch 9/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0657
Epoch 10/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0611
Epoch 11/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0574
Epoch 12/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0548
Epoch 13/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0530
Epoch 14/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0514
Epoch 15/25
8000/8000 [==============================] - 1s 163us/step - loss: 0.0496
Epoch 16/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0480
Epoch 17/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0472
Epoch 18/25
8000/8000 [==============================] - 1s 164us/step - loss: 0.0467
Epoch 19/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0464
Epoch 20/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0461
Epoch 21/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0458
Epoch 22/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0456
Epoch 23/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0454
Epoch 24/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0452
Epoch 25/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0451
mse ev: 0.025953; mse rr: 0.062133 (0.066742); mse simec (0hl): 0.060223 (0.065595); mse simec (2hl): 0.039686 (0.049538)
7
Epoch 1/25
8000/8000 [==============================] - 2s 198us/step - loss: 0.1159
Epoch 2/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0947
Epoch 3/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0889
Epoch 4/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0860
Epoch 5/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0837
Epoch 6/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0809
Epoch 7/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0774
Epoch 8/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0729
Epoch 9/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0681
Epoch 10/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0636
Epoch 11/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0598
Epoch 12/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0570
Epoch 13/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0551
Epoch 14/25
8000/8000 [==============================] - 1s 136us/step - loss: 0.0538
Epoch 15/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0530
Epoch 16/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0526
Epoch 17/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0523
Epoch 18/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0522
Epoch 19/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0520
Epoch 20/25
8000/8000 [==============================] - 1s 148us/step - loss: 0.0519
Epoch 21/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0517
Epoch 22/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0516
Epoch 23/25
8000/8000 [==============================] - 1s 136us/step - loss: 0.0516
Epoch 24/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0515
Epoch 25/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0514
Epoch 1/25
8000/8000 [==============================] - 2s 229us/step - loss: 0.1149
Epoch 2/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0921
Epoch 3/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0862
Epoch 4/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0816
Epoch 5/25
8000/8000 [==============================] - 1s 155us/step - loss: 0.0764
Epoch 6/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0720
Epoch 7/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0675
Epoch 8/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0621
Epoch 9/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0564
Epoch 10/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0516
Epoch 11/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0480
Epoch 12/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0453
Epoch 13/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0432
Epoch 14/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0415
Epoch 15/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0400
Epoch 16/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0388
Epoch 17/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0381
Epoch 18/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0375
Epoch 19/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0372
Epoch 20/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0369
Epoch 21/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0366
Epoch 22/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0363
Epoch 23/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0361
Epoch 24/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0358
Epoch 25/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0356
mse ev: 0.016659; mse rr: 0.059272 (0.064194); mse simec (0hl): 0.058193 (0.063741); mse simec (2hl): 0.035547 (0.045200)
8
Epoch 1/25
8000/8000 [==============================] - 2s 202us/step - loss: 0.1156
Epoch 2/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0947
Epoch 3/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0879
Epoch 4/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0842
Epoch 5/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0804
Epoch 6/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0767
Epoch 7/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0722
Epoch 8/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0670
Epoch 9/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0618
Epoch 10/25
8000/8000 [==============================] - 1s 139us/step - loss: 0.0571
Epoch 11/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0532
Epoch 12/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0501
Epoch 13/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0479
Epoch 14/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0466
Epoch 15/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0459
Epoch 16/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0456
Epoch 17/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0452
Epoch 18/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0450
Epoch 19/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0449
Epoch 20/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0447
Epoch 21/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0446
Epoch 22/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0445
Epoch 23/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0444
Epoch 24/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0443
Epoch 25/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0442
Epoch 1/25
8000/8000 [==============================] - 2s 234us/step - loss: 0.1141
Epoch 2/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0921
Epoch 3/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0861
Epoch 4/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0807
Epoch 5/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0763
Epoch 6/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0713
Epoch 7/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0648
Epoch 8/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0569
Epoch 9/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0487
Epoch 10/25
8000/8000 [==============================] - 1s 164us/step - loss: 0.0416
Epoch 11/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0364
Epoch 12/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0329
Epoch 13/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0308
Epoch 14/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0297
Epoch 15/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0290
Epoch 16/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0285
Epoch 17/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0280
Epoch 18/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0276
Epoch 19/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0272
Epoch 20/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0269
Epoch 21/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0266
Epoch 22/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0262
Epoch 23/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0260
Epoch 24/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0257
Epoch 25/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0254
mse ev: 0.008156; mse rr: 0.057297 (0.062027); mse simec (0hl): 0.056627 (0.062615); mse simec (2hl): 0.031655 (0.041997)
9
Epoch 1/25
8000/8000 [==============================] - 2s 208us/step - loss: 0.1163
Epoch 2/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0938
Epoch 3/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0866
Epoch 4/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0835
Epoch 5/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0807
Epoch 6/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0771
Epoch 7/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0729
Epoch 8/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0678
Epoch 9/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0612
Epoch 10/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0534
Epoch 11/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0465
Epoch 12/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0422
Epoch 13/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0403
Epoch 14/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0394
Epoch 15/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0389
Epoch 16/25
8000/8000 [==============================] - 1s 138us/step - loss: 0.0386
Epoch 17/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0384
Epoch 18/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0381
Epoch 19/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0380
Epoch 20/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0378
Epoch 21/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0377
Epoch 22/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0376
Epoch 23/25
8000/8000 [==============================] - 1s 148us/step - loss: 0.0375
Epoch 24/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0373
Epoch 25/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0373
Epoch 1/25
8000/8000 [==============================] - 2s 244us/step - loss: 0.1133
Epoch 2/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0924
Epoch 3/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0853
Epoch 4/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0788
Epoch 5/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0734
Epoch 6/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0677
Epoch 7/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0618
Epoch 8/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0554
Epoch 9/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0494
Epoch 10/25
8000/8000 [==============================] - 1s 174us/step - loss: 0.0439
Epoch 11/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0384
Epoch 12/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0322
Epoch 13/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0271
Epoch 14/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0238
Epoch 15/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0217
Epoch 16/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0202
Epoch 17/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0193
Epoch 18/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0186
Epoch 19/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0182
Epoch 20/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0178
Epoch 21/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0173
Epoch 22/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0170
Epoch 23/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0167
Epoch 24/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0165
Epoch 25/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0162
mse ev: 0.000000; mse rr: 0.055404 (0.060183); mse simec (0hl): 0.055726 (0.062458); mse simec (2hl): 0.028071 (0.038846)
10
Epoch 1/25
8000/8000 [==============================] - 2s 216us/step - loss: 0.1134
Epoch 2/25
8000/8000 [==============================] - 1s 147us/step - loss: 0.0921
Epoch 3/25
8000/8000 [==============================] - 1s 149us/step - loss: 0.0858
Epoch 4/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0815
Epoch 5/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0787
Epoch 6/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0758
Epoch 7/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0721
Epoch 8/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0671
Epoch 9/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0605
Epoch 10/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0530
Epoch 11/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0466
Epoch 12/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0426
Epoch 13/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0404
Epoch 14/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0394
Epoch 15/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0389
Epoch 16/25
8000/8000 [==============================] - 1s 147us/step - loss: 0.0385
Epoch 17/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0383
Epoch 18/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0381
Epoch 19/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0379
Epoch 20/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0377
Epoch 21/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0376
Epoch 22/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0375
Epoch 23/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0374
Epoch 24/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0373
Epoch 25/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0372
Epoch 1/25
8000/8000 [==============================] - 2s 247us/step - loss: 0.1131
Epoch 2/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0878
Epoch 3/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0801
Epoch 4/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0745
Epoch 5/25
8000/8000 [==============================] - 1s 174us/step - loss: 0.0687
Epoch 6/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0623
Epoch 7/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0550
Epoch 8/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0466
Epoch 9/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0381
Epoch 10/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0312
Epoch 11/25
8000/8000 [==============================] - 1s 168us/step - loss: 0.0267
Epoch 12/25
8000/8000 [==============================] - 1s 167us/step - loss: 0.0240
Epoch 13/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0222
Epoch 14/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0206
Epoch 15/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0196
Epoch 16/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0188
Epoch 17/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0183
Epoch 18/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0179
Epoch 19/25
8000/8000 [==============================] - 1s 169us/step - loss: 0.0175
Epoch 20/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0171
Epoch 21/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0168
Epoch 22/25
8000/8000 [==============================] - 1s 165us/step - loss: 0.0165
Epoch 23/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0162
Epoch 24/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0159
Epoch 25/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0156
mse ev: 0.000000; mse rr: 0.055404 (0.060183); mse simec (0hl): 0.055790 (0.062326); mse simec (2hl): 0.027150 (0.037158)
15
Epoch 1/25
8000/8000 [==============================] - 2s 222us/step - loss: 0.1086
Epoch 2/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0875
Epoch 3/25
8000/8000 [==============================] - 1s 148us/step - loss: 0.0829
Epoch 4/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0809
Epoch 5/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0791
Epoch 6/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0769
Epoch 7/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0739
Epoch 8/25
8000/8000 [==============================] - 1s 149us/step - loss: 0.0698
Epoch 9/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0648
Epoch 10/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0597
Epoch 11/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0547
Epoch 12/25
8000/8000 [==============================] - 1s 143us/step - loss: 0.0505
Epoch 13/25
8000/8000 [==============================] - 1s 147us/step - loss: 0.0472
Epoch 14/25
8000/8000 [==============================] - 1s 147us/step - loss: 0.0450
Epoch 15/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0433
Epoch 16/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0419
Epoch 17/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0406
Epoch 18/25
8000/8000 [==============================] - 1s 145us/step - loss: 0.0395
Epoch 19/25
8000/8000 [==============================] - 1s 142us/step - loss: 0.0387
Epoch 20/25
8000/8000 [==============================] - 1s 141us/step - loss: 0.0381
Epoch 21/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0379
Epoch 22/25
8000/8000 [==============================] - 1s 147us/step - loss: 0.0376
Epoch 23/25
8000/8000 [==============================] - 1s 146us/step - loss: 0.0375
Epoch 24/25
8000/8000 [==============================] - 1s 140us/step - loss: 0.0374
Epoch 25/25
8000/8000 [==============================] - 1s 144us/step - loss: 0.0373
Epoch 1/25
8000/8000 [==============================] - 2s 254us/step - loss: 0.1091
Epoch 2/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0877
Epoch 3/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0797
Epoch 4/25
8000/8000 [==============================] - 1s 174us/step - loss: 0.0729
Epoch 5/25
8000/8000 [==============================] - 1s 175us/step - loss: 0.0678
Epoch 6/25
8000/8000 [==============================] - 1s 166us/step - loss: 0.0626
Epoch 7/25
8000/8000 [==============================] - 1s 174us/step - loss: 0.0562
Epoch 8/25
8000/8000 [==============================] - 1s 175us/step - loss: 0.0485
Epoch 9/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0412
Epoch 10/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0356
Epoch 11/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0314
Epoch 12/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0281
Epoch 13/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0257
Epoch 14/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0238
Epoch 15/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0224
Epoch 16/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0210
Epoch 17/25
8000/8000 [==============================] - 1s 175us/step - loss: 0.0198
Epoch 18/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0190
Epoch 19/25
8000/8000 [==============================] - 1s 171us/step - loss: 0.0185
Epoch 20/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0180
Epoch 21/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0176
Epoch 22/25
8000/8000 [==============================] - 1s 174us/step - loss: 0.0172
Epoch 23/25
8000/8000 [==============================] - 1s 170us/step - loss: 0.0169
Epoch 24/25
8000/8000 [==============================] - 1s 172us/step - loss: 0.0166
Epoch 25/25
8000/8000 [==============================] - 1s 173us/step - loss: 0.0163
mse ev: 0.000000; mse rr: 0.055404 (0.060183); mse simec (0hl): 0.055743 (0.062151); mse simec (2hl): 0.028811 (0.038648)
e_dims= [2, 3, 4, 5, 6, 7, 8, 9, 10, 15]
mse_ev= [0.066419542359582293, 0.055817708926609436, 0.045566146426609618, 0.03561653059674811, 0.025952881207718905, 0.016658888345945339, 0.0081564788795163434, 5.6042560830257886e-27, 5.604119942053966e-27, 5.6039813227498701e-27]
mse_rr= [0.077785515221909782, 0.072825996063333842, 0.068303733602481528, 0.064544647408199707, 0.062132954275257987, 0.059271531912452512, 0.057297454174229032, 0.055403880119418193, 0.055403880119418221, 0.055403880119418221]
mse_rr_test= [0.079355114740819022, 0.075058628769985855, 0.071993964074924865, 0.069153214970815557, 0.066742286814929733, 0.064193869503442441, 0.062026821418338915, 0.060183319509258877, 0.060183319509258933, 0.060183319509258926]
mse_simec= [0.076568050678279334, 0.071038980567454404, 0.067128304292360386, 0.063245236108794844, 0.060222930046918748, 0.058192553110024713, 0.05662710067087004, 0.055726182579776323, 0.055790319068616971, 0.05574320211739401]
mse_simec_test= [0.078447802313442491, 0.074592321031535333, 0.071838303643982596, 0.068595366968850796, 0.065594924397536197, 0.063741306080176044, 0.062614608606356231, 0.062458093870562129, 0.062326449524390932, 0.062150975789054375]
mse_simec_hl= [0.069105497929186688, 0.061359526869828512, 0.052726047060978033, 0.045876387152250481, 0.039686136020634775, 0.035547455212484334, 0.031655299189843708, 0.028070700325343811, 0.027149612964296608, 0.02881112202343359]
mse_simec_hl_test= [0.072083991964546387, 0.066977961723614102, 0.060867678486780513, 0.05488791443997694, 0.049537591128842859, 0.045200275258449364, 0.041996551018341571, 0.038845816921540759, 0.03715755993004817, 0.038647911590194171]

20 Newsgroups

To show that SimEc embeddings can also be computed for other types of data, we do some further experiments with the 20 newsgroups dataset. We subsample 7 of the 20 categories and remove meta information such as headers to avoid overfitting (see also http://scikit-learn.org/stable/datasets/twenty_newsgroups.html). The posts are transformed into very high dimensional tf-idf vectors used as input to the SimEc and to compute the linear kernel matrix.


In [14]:
## load the data and transform it into a tf-idf representation
categories = [
    "comp.graphics",
    "rec.autos",
    "rec.sport.baseball",
    "sci.med",
    "sci.space",
    "soc.religion.christian",
    "talk.politics.guns"
]
newsgroups_train = fetch_20newsgroups(subset='train', remove=(
    'headers', 'footers', 'quotes'), data_home='data', categories=categories, random_state=42)
newsgroups_test = fetch_20newsgroups(subset='test', remove=(
    'headers', 'footers', 'quotes'), data_home='data', categories=categories, random_state=42)
# store in dicts (if the text contains more than 3 words)
textdict = {i: t for i, t in enumerate(newsgroups_train.data) if len(t.split()) > 3}
textdict.update({i: t for i, t in enumerate(newsgroups_test.data, len(newsgroups_train.data)) if len(t.split()) > 3})
train_ids = [i for i in range(len(newsgroups_train.data)) if i in textdict]
test_ids = [i for i in range(len(newsgroups_train.data), len(textdict)) if i in textdict]
print("%i training and %i test samples" % (len(train_ids), len(test_ids)))
# transform into tf-idf features
ft = FeatureTransform(norm='max', weight=True, renorm='max')
docfeats = ft.texts2features(textdict, fit_ids=train_ids)
# organize in feature matrix
X, featurenames = features2mat(docfeats, train_ids)
X_test, _ = features2mat(docfeats, test_ids, featurenames)
print("%i features" % len(featurenames))
targets = np.hstack([newsgroups_train.target,newsgroups_test.target])
y = targets[train_ids]
y_test = targets[test_ids]
n_targets = 1000
target_names = newsgroups_train.target_names


3959 training and 2359 test samples
45813 features

In [15]:
# compute label based simmat
Y = np.tile(y, (len(y), 1))
S = center_K(np.array(Y==Y.T, dtype=int))
Y = np.tile(y_test, (len(y_test), 1))
S_test = center_K(np.array(Y==Y.T, dtype=int))
D, V = np.linalg.eig(S)
# as a comparison: regular kpca embedding: take largest EV
D1, V1 = D[np.argsort(D)[::-1]], V[:,np.argsort(D)[::-1]]
X_embed = np.dot(V1.real, np.diag(np.sqrt(np.abs(D1.real))))
plot_20news(X_embed[:, :2], y, target_names, title='20 newsgroups - 2 largest EV', legend=True)
print("similarity approximation  2D - mse: %f" % check_similarity_match(X_embed[:,:2], S)[0])
print("similarity approximation  5D - mse: %f" % check_similarity_match(X_embed[:,:5], S)[0])
print("similarity approximation  7D - mse: %f" % check_similarity_match(X_embed[:,:7], S)[0])
print("similarity approximation 10D - mse: %f" % check_similarity_match(X_embed[:,:10], S)[0])
print("similarity approximation 25D - mse: %f" % check_similarity_match(X_embed[:,:25], S)[0])


similarity approximation  2D - mse: 0.079464
similarity approximation  5D - mse: 0.018388
similarity approximation  7D - mse: 0.000000
similarity approximation 10D - mse: 0.000000
similarity approximation 25D - mse: 0.000000

In [16]:
n_targets = 2000
# get good alpha for RR model
m = Ridge()
rrm = GridSearchCV(m, {'alpha': [0.000001, 0.00001, 0.0001, 0.001, 0.01, 0.1, 0.25, 0.5, 0.75, 1., 2.5, 5., 7.5, 10., 25., 50., 75., 100., 250., 500., 750., 1000.]})
rrm.fit(X, X_embed[:,:8])
alpha = rrm.best_params_["alpha"]
print("Ridge Regression with alpha: %r" % alpha)
mse_ev, mse_rr, mse_rr_test = [], [], []
mse_simec, mse_simec_test = [], []
mse_simec_hl, mse_simec_hl_test = [], []
e_dims = [2, 3, 4, 5, 6, 7, 8, 9, 10]
for e_dim in e_dims:
    print(e_dim)
    # eigenvalue based embedding
    mse = check_similarity_match(X_embed[:,:e_dim], S)[0]
    mse_ev.append(mse)
    # train a linear ridge regression model to learn the mapping from X to Y
    model = Ridge(alpha=alpha)
    model.fit(X, X_embed[:,:e_dim])
    X_embed_r = model.predict(X)
    X_embed_test_r = model.predict(X_test)
    mse = check_similarity_match(X_embed_r, S)[0]
    mse_rr.append(mse)
    mse = check_similarity_match(X_embed_test_r, S_test)[0]
    mse_rr_test.append(mse)
    # simec - linear
    simec = SimilarityEncoder(X.shape[1], e_dim, n_targets, s_ll_reg=0.5, S_ll=S[:n_targets,:n_targets],
                              sparse_inputs=True, orth_reg=0.1 if e_dim > 6 else 0., l2_reg_emb=0.0001, 
                              l2_reg_out=0.00001, opt=keras.optimizers.Adamax(lr=0.01))
    simec.fit(X, S[:,:n_targets])
    X_embeds = simec.transform(X)
    X_embed_tests = simec.transform(X_test)
    mse = check_similarity_match(X_embeds, S)[0]
    mse_simec.append(mse)
    mse_t = check_similarity_match(X_embed_tests, S_test)[0]
    mse_simec_test.append(mse_t)
    # simec - 2hl
    simec = SimilarityEncoder(X.shape[1], e_dim, n_targets, hidden_layers=[(25, 'tanh'), (25, 'tanh')], sparse_inputs=True,
                              s_ll_reg=1., S_ll=S[:n_targets,:n_targets], orth_reg=0.1 if e_dim > 7 else 0., 
                              l2_reg=0., l2_reg_emb=0.01, l2_reg_out=0.00001, opt=keras.optimizers.Adamax(lr=0.01))
    simec.fit(X, S[:,:n_targets])
    X_embeds = simec.transform(X)
    X_embed_tests = simec.transform(X_test)
    mse = check_similarity_match(X_embeds, S)[0]
    mse_simec_hl.append(mse)
    mse_t = check_similarity_match(X_embed_tests, S_test)[0]
    mse_simec_hl_test.append(mse_t)
    print("mse ev: %f; mse rr: %f (%f); mse simec (0hl): %f (%f); mse simec (2hl): %f (%f)" % (mse_ev[-1], mse_rr[-1], mse_rr_test[-1], mse_simec[-1], mse_simec_test[-1], mse, mse_t))
keras.backend.clear_session()
colors = get_colors(15)
plt.figure();
plt.plot(e_dims, mse_ev, '-o', markersize=3, c=colors[14], label='Eigendecomposition');
plt.plot(e_dims, mse_rr, '-o', markersize=3, c=colors[12], label='ED + Regression');
plt.plot(e_dims, mse_rr_test, '--o', markersize=3, c=colors[12], label='ED + Regression (test)');
plt.plot(e_dims, mse_simec, '-o', markersize=3, c=colors[8], label='SimEc 0hl');
plt.plot(e_dims, mse_simec_test, '--o', markersize=3, c=colors[8], label='SimEc 0hl (test)');
plt.plot(e_dims, mse_simec_hl, '-o', markersize=3, c=colors[4], label='SimEc 2hl');
plt.plot(e_dims, mse_simec_hl_test, '--o', markersize=3, c=colors[4], label='SimEc 2hl (test)');
plt.legend(bbox_to_anchor=(1.02, 1), loc=2, borderaxespad=0.);
plt.title('20 newsgroups (class based similarities)');
plt.plot([0, e_dims[-1]], [0,0], 'k--', linewidth=0.5);
plt.xticks(e_dims, e_dims);
plt.xlabel('Number of Embedding Dimensions ($d$)')
plt.ylabel('Mean Squared Error of $\hat{S}$')
print("e_dims=", e_dims)
print("mse_ev=", mse_ev)
print("mse_rr=", mse_rr)
print("mse_rr_test=", mse_rr_test)
print("mse_simec=", mse_simec)
print("mse_simec_test=", mse_simec_test)
print("mse_simec_hl=", mse_simec_hl)
print("mse_simec_hl_test=", mse_simec_hl_test)


Ridge Regression with alpha: 2.5
2
Epoch 1/25
3959/3959 [==============================] - 3s 642us/step - loss: 0.1624
Epoch 2/25
3959/3959 [==============================] - 1s 193us/step - loss: 0.1421
Epoch 3/25
3959/3959 [==============================] - 1s 189us/step - loss: 0.1386
Epoch 4/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1373
Epoch 5/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.1367
Epoch 6/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.1363
Epoch 7/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.1362
Epoch 8/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1360
Epoch 9/25
3959/3959 [==============================] - 1s 188us/step - loss: 0.1359
Epoch 10/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1358
Epoch 11/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.1357
Epoch 12/25
3959/3959 [==============================] - 1s 186us/step - loss: 0.1355
Epoch 13/25
3959/3959 [==============================] - 1s 193us/step - loss: 0.1355
Epoch 14/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.1353
Epoch 15/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1352
Epoch 16/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.1351
Epoch 17/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.1350
Epoch 18/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1349
Epoch 19/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.1348
Epoch 20/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1347
Epoch 21/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1346
Epoch 22/25
3959/3959 [==============================] - 1s 185us/step - loss: 0.1345
Epoch 23/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.1345
Epoch 24/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.1344
Epoch 25/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.1344
Epoch 1/25
3959/3959 [==============================] - 3s 688us/step - loss: 0.2066
Epoch 2/25
3959/3959 [==============================] - 1s 246us/step - loss: 0.1717
Epoch 3/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1690
Epoch 4/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1682
Epoch 5/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.1678
Epoch 6/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1676
Epoch 7/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.1674
Epoch 8/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.1671
Epoch 9/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.1668
Epoch 10/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1665
Epoch 11/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.1663
Epoch 12/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.1660
Epoch 13/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.1658
Epoch 14/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1658
Epoch 15/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1657
Epoch 16/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.1657
Epoch 17/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.1656
Epoch 18/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1657
Epoch 19/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1656
Epoch 20/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.1656
Epoch 21/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.1656
Epoch 22/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.1656
Epoch 23/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.1655
Epoch 24/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1655
Epoch 25/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1655
mse ev: 0.079464; mse rr: 0.085028 (0.110218); mse simec (0hl): 0.087533 (0.107993); mse simec (2hl): 0.081201 (0.099688)
3
Epoch 1/25
3959/3959 [==============================] - 3s 651us/step - loss: 0.1555
Epoch 2/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.1233
Epoch 3/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.1178
Epoch 4/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.1159
Epoch 5/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1149
Epoch 6/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.1143
Epoch 7/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1141
Epoch 8/25
3959/3959 [==============================] - 1s 190us/step - loss: 0.1138
Epoch 9/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.1137
Epoch 10/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.1134
Epoch 11/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.1133
Epoch 12/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1131
Epoch 13/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.1129
Epoch 14/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.1128
Epoch 15/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.1125
Epoch 16/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.1123
Epoch 17/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1122
Epoch 18/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1121
Epoch 19/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.1120
Epoch 20/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1118
Epoch 21/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.1118
Epoch 22/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1117
Epoch 23/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.1116
Epoch 24/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.1115
Epoch 25/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.1115
Epoch 1/25
3959/3959 [==============================] - 3s 707us/step - loss: 0.1959
Epoch 2/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.1379
Epoch 3/25
3959/3959 [==============================] - 1s 232us/step - loss: 0.1327
Epoch 4/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1310
Epoch 5/25
3959/3959 [==============================] - 1s 246us/step - loss: 0.1300
Epoch 6/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.1293
Epoch 7/25
3959/3959 [==============================] - 1s 226us/step - loss: 0.1288
Epoch 8/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1283
Epoch 9/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.1280
Epoch 10/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.1278
Epoch 11/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.1276
Epoch 12/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1275
Epoch 13/25
3959/3959 [==============================] - 1s 231us/step - loss: 0.1274
Epoch 14/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.1274
Epoch 15/25
3959/3959 [==============================] - 1s 229us/step - loss: 0.1274
Epoch 16/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.1274
Epoch 17/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1274
Epoch 18/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.1273
Epoch 19/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1273
Epoch 20/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1273
Epoch 21/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.1273
Epoch 22/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1273
Epoch 23/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1273
Epoch 24/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.1273
Epoch 25/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.1273
mse ev: 0.058687; mse rr: 0.067327 (0.104408); mse simec (0hl): 0.071293 (0.103113); mse simec (2hl): 0.060645 (0.091182)
4
Epoch 1/25
3959/3959 [==============================] - 3s 662us/step - loss: 0.1428
Epoch 2/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.1015
Epoch 3/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0943
Epoch 4/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0919
Epoch 5/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0908
Epoch 6/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0901
Epoch 7/25
3959/3959 [==============================] - 1s 204us/step - loss: 0.0897
Epoch 8/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0894
Epoch 9/25
3959/3959 [==============================] - 1s 193us/step - loss: 0.0893
Epoch 10/25
3959/3959 [==============================] - 1s 190us/step - loss: 0.0890
Epoch 11/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0890
Epoch 12/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0887
Epoch 13/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0885
Epoch 14/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0884
Epoch 15/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0883
Epoch 16/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.0883
Epoch 17/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0882
Epoch 18/25
3959/3959 [==============================] - 1s 215us/step - loss: 0.0881
Epoch 19/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0880
Epoch 20/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0880 0s 
Epoch 21/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0879
Epoch 22/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0879
Epoch 23/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0879
Epoch 24/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0878
Epoch 25/25
3959/3959 [==============================] - 1s 187us/step - loss: 0.0879
Epoch 1/25
3959/3959 [==============================] - 3s 702us/step - loss: 0.1753
Epoch 2/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.1025
Epoch 3/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0967
Epoch 4/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0946
Epoch 5/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0935
Epoch 6/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0928
Epoch 7/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0923
Epoch 8/25
3959/3959 [==============================] - 1s 249us/step - loss: 0.0920
Epoch 9/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0918
Epoch 10/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0916
Epoch 11/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0914
Epoch 12/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0914
Epoch 13/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0913
Epoch 14/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0913
Epoch 15/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0912
Epoch 16/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0911
Epoch 17/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0911
Epoch 18/25
3959/3959 [==============================] - 1s 224us/step - loss: 0.0911
Epoch 19/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0911
Epoch 20/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0911
Epoch 21/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0910
Epoch 22/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0910
Epoch 23/25
3959/3959 [==============================] - 1s 230us/step - loss: 0.0910
Epoch 24/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0910
Epoch 25/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0911
mse ev: 0.038285; mse rr: 0.049976 (0.096880); mse simec (0hl): 0.051783 (0.097694); mse simec (2hl): 0.039308 (0.074846)
5
Epoch 1/25
3959/3959 [==============================] - 3s 675us/step - loss: 0.1323
Epoch 2/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.0799
Epoch 3/25
3959/3959 [==============================] - 1s 193us/step - loss: 0.0711
Epoch 4/25
3959/3959 [==============================] - 1s 189us/step - loss: 0.0682
Epoch 5/25
3959/3959 [==============================] - 1s 186us/step - loss: 0.0668
Epoch 6/25
3959/3959 [==============================] - 1s 190us/step - loss: 0.0662
Epoch 7/25
3959/3959 [==============================] - 1s 190us/step - loss: 0.0658
Epoch 8/25
3959/3959 [==============================] - 1s 193us/step - loss: 0.0657
Epoch 9/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.0658
Epoch 10/25
3959/3959 [==============================] - 1s 190us/step - loss: 0.0657
Epoch 11/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0656
Epoch 12/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0656
Epoch 13/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0655
Epoch 14/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.0655
Epoch 15/25
3959/3959 [==============================] - 1s 189us/step - loss: 0.0654
Epoch 16/25
3959/3959 [==============================] - 1s 188us/step - loss: 0.0654
Epoch 17/25
3959/3959 [==============================] - 1s 186us/step - loss: 0.0653
Epoch 18/25
3959/3959 [==============================] - 1s 189us/step - loss: 0.0653
Epoch 19/25
3959/3959 [==============================] - 1s 189us/step - loss: 0.0652
Epoch 20/25
3959/3959 [==============================] - 1s 188us/step - loss: 0.0653
Epoch 21/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0653
Epoch 22/25
3959/3959 [==============================] - 1s 189us/step - loss: 0.0653
Epoch 23/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0652
Epoch 24/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.0652
Epoch 25/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0652
Epoch 1/25
3959/3959 [==============================] - 3s 719us/step - loss: 0.1772
Epoch 2/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0872
Epoch 3/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0646
Epoch 4/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0598
Epoch 5/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0579
Epoch 6/25
3959/3959 [==============================] - 1s 231us/step - loss: 0.0569
Epoch 7/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0563
Epoch 8/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0559
Epoch 9/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0557
Epoch 10/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0555
Epoch 11/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0553
Epoch 12/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0552
Epoch 13/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0551
Epoch 14/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0551
Epoch 15/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0550
Epoch 16/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0550
Epoch 17/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0550
Epoch 18/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0549
Epoch 19/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0549
Epoch 20/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0549
Epoch 21/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0549
Epoch 22/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0549
Epoch 23/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0549
Epoch 24/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0549
Epoch 25/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0548
mse ev: 0.018388; mse rr: 0.033179 (0.091087); mse simec (0hl): 0.035113 (0.092189); mse simec (2hl): 0.019714 (0.066781)
6
Epoch 1/25
3959/3959 [==============================] - 3s 696us/step - loss: 0.1312
Epoch 2/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0643
Epoch 3/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.0501
Epoch 4/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0460
Epoch 5/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0443
Epoch 6/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0435
Epoch 7/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0431
Epoch 8/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0430
Epoch 9/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.0431
Epoch 10/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0432
Epoch 11/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0430
Epoch 12/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0430
Epoch 13/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0430
Epoch 14/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0430
Epoch 15/25
3959/3959 [==============================] - 1s 210us/step - loss: 0.0429
Epoch 16/25
3959/3959 [==============================] - 1s 204us/step - loss: 0.0430
Epoch 17/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0429
Epoch 18/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0429
Epoch 19/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0429
Epoch 20/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0429
Epoch 21/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0429
Epoch 22/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.0428
Epoch 23/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0429
Epoch 24/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0430
Epoch 25/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.0429
Epoch 1/25
3959/3959 [==============================] - 3s 740us/step - loss: 0.1693
Epoch 2/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0474
Epoch 3/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0292
Epoch 4/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0247
Epoch 5/25
3959/3959 [==============================] - 1s 247us/step - loss: 0.0228
Epoch 6/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0218
Epoch 7/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0211
Epoch 8/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0207
Epoch 9/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0203
Epoch 10/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0201
Epoch 11/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0199
Epoch 12/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0197
Epoch 13/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0196
Epoch 14/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0195
Epoch 15/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0195
Epoch 16/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0194
Epoch 17/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0194
Epoch 18/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0193
Epoch 19/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0193
Epoch 20/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0193
Epoch 21/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0192
Epoch 22/25
3959/3959 [==============================] - 1s 230us/step - loss: 0.0192
Epoch 23/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0192
Epoch 24/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0192
Epoch 25/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0191
mse ev: 0.000000; mse rr: 0.017187 (0.085461); mse simec (0hl): 0.017848 (0.085673); mse simec (2hl): 0.000196 (0.053679)
7
Epoch 1/25
3959/3959 [==============================] - 3s 707us/step - loss: 0.1645
Epoch 2/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.1021
Epoch 3/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0762
Epoch 4/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0646
Epoch 5/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0588
Epoch 6/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0557
Epoch 7/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0540
Epoch 8/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0529
Epoch 9/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0523
Epoch 10/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0518
Epoch 11/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0513
Epoch 12/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0509
Epoch 13/25
3959/3959 [==============================] - 1s 192us/step - loss: 0.0506
Epoch 14/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0502
Epoch 15/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0497
Epoch 16/25
3959/3959 [==============================] - 1s 204us/step - loss: 0.0493
Epoch 17/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0488
Epoch 18/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0482
Epoch 19/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0476
Epoch 20/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0470
Epoch 21/25
3959/3959 [==============================] - 1s 208us/step - loss: 0.0465
Epoch 22/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0459
Epoch 23/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0454
Epoch 24/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0450
Epoch 25/25
3959/3959 [==============================] - 1s 194us/step - loss: 0.0446
Epoch 1/25
3959/3959 [==============================] - 3s 754us/step - loss: 0.1634
Epoch 2/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0450
Epoch 3/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0288
Epoch 4/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0247
Epoch 5/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0228
Epoch 6/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0218
Epoch 7/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0212
Epoch 8/25
3959/3959 [==============================] - 1s 250us/step - loss: 0.0207
Epoch 9/25
3959/3959 [==============================] - 1s 247us/step - loss: 0.0204
Epoch 10/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0200
Epoch 11/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0198
Epoch 12/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0196
Epoch 13/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0195
Epoch 14/25
3959/3959 [==============================] - 1s 250us/step - loss: 0.0194
Epoch 15/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0194
Epoch 16/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0194
Epoch 17/25
3959/3959 [==============================] - 1s 229us/step - loss: 0.0193
Epoch 18/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0193
Epoch 19/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0193
Epoch 20/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0193
Epoch 21/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0192
Epoch 22/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0192
Epoch 23/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0192
Epoch 24/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0192
Epoch 25/25
3959/3959 [==============================] - 1s 246us/step - loss: 0.0192
mse ev: 0.000000; mse rr: 0.017199 (0.085457); mse simec (0hl): 0.017515 (0.085433); mse simec (2hl): 0.000209 (0.053658)
8
Epoch 1/25
3959/3959 [==============================] - 3s 712us/step - loss: 0.1582
Epoch 2/25
3959/3959 [==============================] - 1s 204us/step - loss: 0.0958
Epoch 3/25
3959/3959 [==============================] - 1s 206us/step - loss: 0.0726
Epoch 4/25
3959/3959 [==============================] - 1s 206us/step - loss: 0.0623
Epoch 5/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0575
Epoch 6/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0550
Epoch 7/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0537
Epoch 8/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0527
Epoch 9/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0522
Epoch 10/25
3959/3959 [==============================] - 1s 206us/step - loss: 0.0516
Epoch 11/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0510
Epoch 12/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0504
Epoch 13/25
3959/3959 [==============================] - 1s 191us/step - loss: 0.0497
Epoch 14/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0490
Epoch 15/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0482
Epoch 16/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0475
Epoch 17/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0466
Epoch 18/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0459
Epoch 19/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0454
Epoch 20/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.0448
Epoch 21/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0444
Epoch 22/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0440
Epoch 23/25
3959/3959 [==============================] - 1s 204us/step - loss: 0.0439
Epoch 24/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0437
Epoch 25/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0436
Epoch 1/25
3959/3959 [==============================] - 3s 766us/step - loss: 0.2210
Epoch 2/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.1107
Epoch 3/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0555
Epoch 4/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0366
Epoch 5/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0296
Epoch 6/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0259
Epoch 7/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0238
Epoch 8/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0226
Epoch 9/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0218
Epoch 10/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0213
Epoch 11/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0209
Epoch 12/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0208
Epoch 13/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0206
Epoch 14/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0205
Epoch 15/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0205
Epoch 16/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0205
Epoch 17/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0204
Epoch 18/25
3959/3959 [==============================] - 1s 237us/step - loss: 0.0204
Epoch 19/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0203
Epoch 20/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0203
Epoch 21/25
3959/3959 [==============================] - 1s 235us/step - loss: 0.0203
Epoch 22/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0203
Epoch 23/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0203
Epoch 24/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0202
Epoch 25/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0203
mse ev: 0.000000; mse rr: 0.017199 (0.085458); mse simec (0hl): 0.017935 (0.085985); mse simec (2hl): 0.000314 (0.053995)
9
Epoch 1/25
3959/3959 [==============================] - 3s 728us/step - loss: 0.1621
Epoch 2/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0983
Epoch 3/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0735
Epoch 4/25
3959/3959 [==============================] - 1s 208us/step - loss: 0.0632
Epoch 5/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0584
Epoch 6/25
3959/3959 [==============================] - 1s 211us/step - loss: 0.0560
Epoch 7/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0546
Epoch 8/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0538
Epoch 9/25
3959/3959 [==============================] - 1s 195us/step - loss: 0.0532
Epoch 10/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0528
Epoch 11/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0524
Epoch 12/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0520
Epoch 13/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0517
Epoch 14/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0511
Epoch 15/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.0505
Epoch 16/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0498
Epoch 17/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0489
Epoch 18/25
3959/3959 [==============================] - 1s 196us/step - loss: 0.0481
Epoch 19/25
3959/3959 [==============================] - 1s 210us/step - loss: 0.0471
Epoch 20/25
3959/3959 [==============================] - 1s 197us/step - loss: 0.0463
Epoch 21/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0455
Epoch 22/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0449
Epoch 23/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0444
Epoch 24/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0440
Epoch 25/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0436
Epoch 1/25
3959/3959 [==============================] - 3s 783us/step - loss: 0.2149
Epoch 2/25
3959/3959 [==============================] - 1s 247us/step - loss: 0.0875
Epoch 3/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0403
Epoch 4/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0281
Epoch 5/25
3959/3959 [==============================] - 1s 249us/step - loss: 0.0242
Epoch 6/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0226
Epoch 7/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0218
Epoch 8/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0214
Epoch 9/25
3959/3959 [==============================] - 1s 246us/step - loss: 0.0210
Epoch 10/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0207
Epoch 11/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0206 0
Epoch 12/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0204
Epoch 13/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0203
Epoch 14/25
3959/3959 [==============================] - 1s 233us/step - loss: 0.0202
Epoch 15/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0202
Epoch 16/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0201
Epoch 17/25
3959/3959 [==============================] - 1s 238us/step - loss: 0.0200
Epoch 18/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0201
Epoch 19/25
3959/3959 [==============================] - 1s 246us/step - loss: 0.0200
Epoch 20/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0200
Epoch 21/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0200
Epoch 22/25
3959/3959 [==============================] - 1s 248us/step - loss: 0.0200
Epoch 23/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0200
Epoch 24/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0199
Epoch 25/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0200
mse ev: 0.000000; mse rr: 0.017201 (0.085460); mse simec (0hl): 0.018266 (0.085965); mse simec (2hl): 0.000326 (0.054106)
10
Epoch 1/25
3959/3959 [==============================] - 3s 754us/step - loss: 0.1531
Epoch 2/25
3959/3959 [==============================] - 1s 211us/step - loss: 0.0955
Epoch 3/25
3959/3959 [==============================] - 1s 200us/step - loss: 0.0772
Epoch 4/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0687
Epoch 5/25
3959/3959 [==============================] - 1s 207us/step - loss: 0.0636
Epoch 6/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0598
Epoch 7/25
3959/3959 [==============================] - 1s 204us/step - loss: 0.0565
Epoch 8/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0536
Epoch 9/25
3959/3959 [==============================] - 1s 199us/step - loss: 0.0509
Epoch 10/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0487
Epoch 11/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0469
Epoch 12/25
3959/3959 [==============================] - 1s 207us/step - loss: 0.0456
Epoch 13/25
3959/3959 [==============================] - 1s 208us/step - loss: 0.0449
Epoch 14/25
3959/3959 [==============================] - 1s 198us/step - loss: 0.0442
Epoch 15/25
3959/3959 [==============================] - 1s 203us/step - loss: 0.0439
Epoch 16/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0437
Epoch 17/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0435
Epoch 18/25
3959/3959 [==============================] - 1s 201us/step - loss: 0.0434
Epoch 19/25
3959/3959 [==============================] - 1s 210us/step - loss: 0.0433
Epoch 20/25
3959/3959 [==============================] - 1s 209us/step - loss: 0.0434
Epoch 21/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0433
Epoch 22/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0432
Epoch 23/25
3959/3959 [==============================] - 1s 206us/step - loss: 0.0432
Epoch 24/25
3959/3959 [==============================] - 1s 205us/step - loss: 0.0432
Epoch 25/25
3959/3959 [==============================] - 1s 202us/step - loss: 0.0431
Epoch 1/25
3959/3959 [==============================] - 3s 797us/step - loss: 0.2049
Epoch 2/25
3959/3959 [==============================] - 1s 246us/step - loss: 0.0832
Epoch 3/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0478
Epoch 4/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0344
Epoch 5/25
3959/3959 [==============================] - 1s 240us/step - loss: 0.0272
Epoch 6/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0235
Epoch 7/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0218
Epoch 8/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0210
Epoch 9/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0205
Epoch 10/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0203
Epoch 11/25
3959/3959 [==============================] - 1s 239us/step - loss: 0.0201
Epoch 12/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0200
Epoch 13/25
3959/3959 [==============================] - 1s 234us/step - loss: 0.0199
Epoch 14/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0198
Epoch 15/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0198
Epoch 16/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0197
Epoch 17/25
3959/3959 [==============================] - 1s 247us/step - loss: 0.0197
Epoch 18/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0196
Epoch 19/25
3959/3959 [==============================] - 1s 241us/step - loss: 0.0196
Epoch 20/25
3959/3959 [==============================] - 1s 242us/step - loss: 0.0196
Epoch 21/25
3959/3959 [==============================] - 1s 243us/step - loss: 0.0196
Epoch 22/25
3959/3959 [==============================] - 1s 244us/step - loss: 0.0196
Epoch 23/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0196
Epoch 24/25
3959/3959 [==============================] - 1s 236us/step - loss: 0.0196
Epoch 25/25
3959/3959 [==============================] - 1s 245us/step - loss: 0.0195
mse ev: 0.000000; mse rr: 0.017208 (0.085455); mse simec (0hl): 0.017807 (0.086144); mse simec (2hl): 0.000463 (0.053031)
e_dims= [2, 3, 4, 5, 6, 7, 8, 9, 10]
mse_ev= [0.079464455624083954, 0.058687039046107572, 0.038285110041301994, 0.018388250034337913, 2.6538365259635093e-26, 2.653836904505227e-26, 2.6538297145150965e-26, 2.6538292763372156e-26, 2.6538292947522621e-26]
mse_rr= [0.085028030832504736, 0.067326611108990478, 0.049975511642569158, 0.033178763214808171, 0.017186762133309635, 0.017199435774713015, 0.017199300059710845, 0.017200737643222792, 0.017208486279029277]
mse_rr_test= [0.11021762292457779, 0.10440775366448585, 0.096880033044653008, 0.09108691843807093, 0.085460686677100009, 0.085456521123557783, 0.085457932785752269, 0.085460393160600348, 0.085455345771455804]
mse_simec= [0.08753334350622724, 0.071293130217505676, 0.051783069863455169, 0.035112512416313184, 0.017847528230360503, 0.017515305706111775, 0.017935028875923624, 0.01826579391791797, 0.017806815295476543]
mse_simec_test= [0.10799251849317386, 0.10311345084554424, 0.097693630432529016, 0.092189486743296217, 0.085673350543041912, 0.08543305150879292, 0.085984963146631135, 0.085965204859459771, 0.086143724260019511]
mse_simec_hl= [0.081200967462767976, 0.060644826118850824, 0.039308321304160329, 0.019713573377145091, 0.00019633875848988706, 0.00020949235412545835, 0.00031426001543514141, 0.00032586283541852419, 0.00046287297898347905]
mse_simec_hl_test= [0.099687948424341602, 0.091181512685112223, 0.074846027181633537, 0.066781081197467951, 0.053678724835647673, 0.05365846429001106, 0.053995172653991258, 0.054105811334050416, 0.05303061484356205]

In [ ]: