Train a basic TensorFlow Lite for Microcontrollers model

This notebook demonstrates the process of training a 2.5 kB model using TensorFlow and converting it for use with TensorFlow Lite for Microcontrollers.

Deep learning networks learn to model patterns in underlying data. Here, we're going to train a network to model data generated by a sine function. This will result in a model that can take a value, x, and predict its sine, y.

The model created in this notebook is used in the hello_world example for TensorFlow Lite for Microcontrollers.


Training is much faster with GPU acceleration. Before you proceed, ensure you are using a GPU runtime by going to Runtime -> Change runtime type and setting Hardware accelerator to GPU.
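
As an optional sanity check, the following cell lists the GPUs TensorFlow can see. It uses tf.config.experimental.list_physical_devices, which is available in TensorFlow 2.0 (newer releases also provide tf.config.list_physical_devices):

import tensorflow as tf
# An empty list here means TensorFlow cannot see a GPU
print(tf.config.experimental.list_physical_devices('GPU'))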

Configure Defaults


In [0]:
# Define paths to model files
import os
MODELS_DIR = 'models/'
if not os.path.exists(MODELS_DIR):
    os.mkdir(MODELS_DIR)
MODEL_TF = MODELS_DIR + 'model.pb'
MODEL_NO_QUANT_TFLITE = MODELS_DIR + 'model_no_quant.tflite'
MODEL_TFLITE = MODELS_DIR + 'model.tflite'
MODEL_TFLITE_MICRO = MODELS_DIR + 'model.cc'

Setup Environment

Install Dependencies


In [2]:
! pip install -q tensorflow==2


  Building wheel for gast (setup.py) ... done
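
To confirm which TensorFlow version is now active (you may need to restart the runtime after the install), you can print it:

import tensorflow as tf
print(tf.__version__)  # should print a 2.x version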

Set Seed for Repeatable Results


In [0]:
# Set a "seed" value so we get the same random numbers each time we run this
# notebook, for reproducible results.
# NumPy is a math library
import numpy as np
np.random.seed(1)  # NumPy seed
# TensorFlow is an open source machine learning library
import tensorflow as tf
tf.random.set_seed(1)  # TensorFlow global random seed

Import Dependencies


In [0]:
# Keras is TensorFlow's high-level API for deep learning
from tensorflow import keras
# Matplotlib is a graphing library
import matplotlib.pyplot as plt
# Math is Python's math library
import math

Dataset

1. Generate Data

The code in the following cell will generate a set of random x values, calculate their sine values, and display them on a graph.


In [5]:
# Number of sample datapoints
SAMPLES = 1000

# Generate a uniformly distributed set of random numbers in the range from
# 0 to 2π, which covers a complete sine wave oscillation
x_values = np.random.uniform(
    low=0, high=2*math.pi, size=SAMPLES).astype(np.float32)

# Shuffle the values to guarantee they're not in order
np.random.shuffle(x_values)

# Calculate the corresponding sine values
y_values = np.sin(x_values).astype(np.float32)

# Plot our data. The 'b.' argument tells the library to plot the data as blue dots.
plt.plot(x_values, y_values, 'b.')
plt.show()


2. Add Noise

Since it was generated directly by the sine function, our data fits a nice, smooth curve.

However, machine learning models are good at extracting underlying meaning from messy, real-world data. To demonstrate this, we can add some noise to our data to approximate something more lifelike.

In the following cell, we'll add some random noise to each value, then draw a new graph:


In [6]:
# Add a small random number to each y value
y_values += 0.1 * np.random.randn(*y_values.shape)

# Plot our data
plt.plot(x_values, y_values, 'b.')
plt.show()


3. Split the Data

We now have a noisy dataset that approximates real-world data. We'll be using this to train our model.

To evaluate the accuracy of the model we train, we'll need to compare its predictions to real data and check how well they match up. This evaluation happens during training (where it is referred to as validation) and after training (referred to as testing). In both cases, it's important that we use fresh data that was not already used to train the model.

The data is split as follows:

  1. Training: 60%
  2. Validation: 20%
  3. Testing: 20%

The following code splits our data and then plots each set in a different color:


In [7]:
# We'll use 60% of our data for training and 20% for testing. The remaining 20%
# will be used for validation. Calculate the indices of each section.
TRAIN_SPLIT = int(0.6 * SAMPLES)
TEST_SPLIT = int(0.2 * SAMPLES + TRAIN_SPLIT)

# Use np.split to chop our data into three parts.
# The second argument to np.split is an array of indices where the data will be
# split. We provide two indices, so the data will be divided into three chunks.
x_train, x_test, x_validate = np.split(x_values, [TRAIN_SPLIT, TEST_SPLIT])
y_train, y_test, y_validate = np.split(y_values, [TRAIN_SPLIT, TEST_SPLIT])

# Double check that our splits add up correctly
assert (x_train.size + x_validate.size + x_test.size) == SAMPLES

# Plot the data in each partition in different colors:
plt.plot(x_train, y_train, 'b.', label="Train")
plt.plot(x_test, y_test, 'r.', label="Test")
plt.plot(x_validate, y_validate, 'y.', label="Validate")
plt.legend()
plt.show()


Training

1. Design the Model

We're going to build a simple neural network model that will take an input value (in this case, x) and use it to predict a numeric output value (the sine of x). This type of problem is called a regression. It will use layers of neurons to attempt to learn any patterns underlying the training data, so it can make predictions.

To begin with, we'll define two layers. The first layer takes a single input (our x value) and runs it through 8 neurons. Based on this input, each neuron will activate to a certain degree determined by its internal state (its weight and bias values). A neuron's degree of activation is expressed as a number.

The activation numbers from our first layer will be fed as inputs to our second layer, which is a single neuron. It will apply its own weights and bias to these inputs and calculate its own activation, which will be output as our y value.
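
To make this concrete, here is a minimal NumPy sketch of the forward pass just described. The weights and biases below are made-up placeholders; the real values are learned during training:

# Hypothetical, randomly initialized parameters (stand-ins for learned values)
W1 = np.random.randn(1, 8).astype(np.float32)  # first layer: 1 input -> 8 neurons
b1 = np.zeros(8, dtype=np.float32)
W2 = np.random.randn(8, 1).astype(np.float32)  # second layer: 8 activations -> 1 output
b2 = np.zeros(1, dtype=np.float32)

def forward(x):
  hidden = np.maximum(0.0, x * W1 + b1)  # Dense(8) with 'relu' activation
  return hidden.dot(W2) + b2             # Dense(1), no activation

print(forward(0.5))  # an (untrained) guess at sin(0.5)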

Note: To learn more about how neural networks function, you can explore the Learn TensorFlow codelabs.

The code in the following cell defines our model using Keras, TensorFlow's high-level API for creating deep learning networks. Once the network is defined, we compile it, specifying parameters that determine how it will be trained:


In [0]:
# We'll use Keras to create a simple model architecture
model_1 = keras.Sequential()

# First layer takes a scalar input and feeds it through 8 "neurons". The
# neurons decide whether to activate based on the 'relu' activation function.
model_1.add(keras.layers.Dense(8, activation='relu', input_shape=(1,)))

# Final layer is a single neuron, since we want to output a single value
model_1.add(keras.layers.Dense(1))

# Compile the model using a standard optimizer and loss function for regression
model_1.compile(optimizer='adam', loss='mse', metrics=['mae'])
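
Optionally, you can confirm how small this model is. The first layer has 1×8 weights plus 8 biases (16 parameters) and the second has 8×1 weights plus 1 bias (9 parameters), for 25 trainable parameters in total:

# Print the layers and parameter counts (expect 25 trainable parameters)
model_1.summary()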

2. Train the Model

Once we've defined the model, we can use our data to train it. Training involves passing an x value into the neural network, checking how far the network's output deviates from the expected y value, and adjusting the neurons' weights and biases so that the output is more likely to be correct the next time.

Training runs this process on the full dataset multiple times, and each full run-through is known as an epoch. The number of epochs to run during training is a parameter we can set.

During each epoch, data is run through the network in multiple batches. In each batch, several pieces of data are passed into the network, producing output values. The outputs' correctness is measured in aggregate, and the network's weights and biases are adjusted accordingly, once per batch. The batch size is also a parameter we can set.
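
For example, with the 600 training samples and the batch size of 64 used below, each epoch performs ceil(600 / 64) = 10 weight updates:

import math
# 9 full batches of 64 plus one final batch of 24 samples -> 10 updates per epoch
print(math.ceil(600 / 64))  # 10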

The code in the following cell uses the x and y values from our training data to train the model. It runs for 500 epochs, with 64 pieces of data in each batch. We also pass in some data for validation. As you will see when you run the cell, training can take a while to complete:


In [9]:
# Train the model on our training data while validating on our validation set
history_1 = model_1.fit(x_train, y_train, epochs=500, batch_size=64,
                        validation_data=(x_validate, y_validate))


Train on 600 samples, validate on 200 samples
Epoch 1/500
600/600 [==============================] - 1s 971us/sample - loss: 0.6936 - mae: 0.6897 - val_loss: 0.6396 - val_mae: 0.6501
Epoch 2/500
600/600 [==============================] - 0s 50us/sample - loss: 0.5965 - mae: 0.6254 - val_loss: 0.5594 - val_mae: 0.6035
Epoch 3/500
600/600 [==============================] - 0s 47us/sample - loss: 0.5240 - mae: 0.5830 - val_loss: 0.5021 - val_mae: 0.5765
Epoch 4/500
600/600 [==============================] - 0s 47us/sample - loss: 0.4724 - mae: 0.5549 - val_loss: 0.4634 - val_mae: 0.5615
Epoch 5/500
600/600 [==============================] - 0s 55us/sample - loss: 0.4392 - mae: 0.5390 - val_loss: 0.4375 - val_mae: 0.5533
Epoch 6/500
600/600 [==============================] - 0s 49us/sample - loss: 0.4174 - mae: 0.5305 - val_loss: 0.4215 - val_mae: 0.5487
Epoch 7/500
600/600 [==============================] - 0s 46us/sample - loss: 0.4026 - mae: 0.5244 - val_loss: 0.4119 - val_mae: 0.5464
Epoch 8/500
600/600 [==============================] - 0s 45us/sample - loss: 0.3939 - mae: 0.5225 - val_loss: 0.4057 - val_mae: 0.5452
Epoch 9/500
600/600 [==============================] - 0s 44us/sample - loss: 0.3880 - mae: 0.5216 - val_loss: 0.4015 - val_mae: 0.5439
Epoch 10/500
600/600 [==============================] - 0s 46us/sample - loss: 0.3836 - mae: 0.5210 - val_loss: 0.3981 - val_mae: 0.5425
Epoch 11/500
600/600 [==============================] - 0s 44us/sample - loss: 0.3802 - mae: 0.5205 - val_loss: 0.3950 - val_mae: 0.5412
Epoch 12/500
600/600 [==============================] - 0s 43us/sample - loss: 0.3770 - mae: 0.5200 - val_loss: 0.3922 - val_mae: 0.5400
Epoch 13/500
600/600 [==============================] - 0s 47us/sample - loss: 0.3741 - mae: 0.5189 - val_loss: 0.3894 - val_mae: 0.5385
Epoch 14/500
600/600 [==============================] - 0s 41us/sample - loss: 0.3712 - mae: 0.5173 - val_loss: 0.3866 - val_mae: 0.5368
Epoch 15/500
600/600 [==============================] - 0s 51us/sample - loss: 0.3686 - mae: 0.5162 - val_loss: 0.3837 - val_mae: 0.5354
Epoch 16/500
600/600 [==============================] - 0s 45us/sample - loss: 0.3655 - mae: 0.5143 - val_loss: 0.3808 - val_mae: 0.5335
Epoch 17/500
600/600 [==============================] - 0s 42us/sample - loss: 0.3627 - mae: 0.5122 - val_loss: 0.3777 - val_mae: 0.5314
Epoch 18/500
600/600 [==============================] - 0s 53us/sample - loss: 0.3597 - mae: 0.5101 - val_loss: 0.3748 - val_mae: 0.5296
Epoch 19/500
600/600 [==============================] - 0s 44us/sample - loss: 0.3567 - mae: 0.5080 - val_loss: 0.3717 - val_mae: 0.5276
Epoch 20/500
600/600 [==============================] - 0s 42us/sample - loss: 0.3538 - mae: 0.5059 - val_loss: 0.3686 - val_mae: 0.5256
Epoch 21/500
600/600 [==============================] - 0s 47us/sample - loss: 0.3507 - mae: 0.5037 - val_loss: 0.3654 - val_mae: 0.5234
Epoch 22/500
600/600 [==============================] - 0s 47us/sample - loss: 0.3477 - mae: 0.5012 - val_loss: 0.3622 - val_mae: 0.5211
Epoch 23/500
600/600 [==============================] - 0s 41us/sample - loss: 0.3447 - mae: 0.4993 - val_loss: 0.3591 - val_mae: 0.5195
Epoch 24/500
600/600 [==============================] - 0s 56us/sample - loss: 0.3414 - mae: 0.4970 - val_loss: 0.3558 - val_mae: 0.5172
Epoch 25/500
600/600 [==============================] - 0s 51us/sample - loss: 0.3385 - mae: 0.4949 - val_loss: 0.3526 - val_mae: 0.5153
Epoch 26/500
600/600 [==============================] - 0s 49us/sample - loss: 0.3352 - mae: 0.4926 - val_loss: 0.3493 - val_mae: 0.5130
Epoch 27/500
600/600 [==============================] - 0s 53us/sample - loss: 0.3321 - mae: 0.4904 - val_loss: 0.3461 - val_mae: 0.5110
Epoch 28/500
600/600 [==============================] - 0s 56us/sample - loss: 0.3288 - mae: 0.4880 - val_loss: 0.3429 - val_mae: 0.5087
Epoch 29/500
600/600 [==============================] - 0s 51us/sample - loss: 0.3257 - mae: 0.4854 - val_loss: 0.3395 - val_mae: 0.5064
Epoch 30/500
600/600 [==============================] - 0s 50us/sample - loss: 0.3227 - mae: 0.4831 - val_loss: 0.3362 - val_mae: 0.5041
Epoch 31/500
600/600 [==============================] - 0s 53us/sample - loss: 0.3195 - mae: 0.4806 - val_loss: 0.3330 - val_mae: 0.5018
Epoch 32/500
600/600 [==============================] - 0s 55us/sample - loss: 0.3165 - mae: 0.4782 - val_loss: 0.3298 - val_mae: 0.4996
Epoch 33/500
600/600 [==============================] - 0s 56us/sample - loss: 0.3133 - mae: 0.4760 - val_loss: 0.3267 - val_mae: 0.4976
Epoch 34/500
600/600 [==============================] - 0s 56us/sample - loss: 0.3103 - mae: 0.4738 - val_loss: 0.3235 - val_mae: 0.4952
Epoch 35/500
600/600 [==============================] - 0s 51us/sample - loss: 0.3072 - mae: 0.4713 - val_loss: 0.3203 - val_mae: 0.4930
Epoch 36/500
600/600 [==============================] - 0s 100us/sample - loss: 0.3042 - mae: 0.4694 - val_loss: 0.3173 - val_mae: 0.4913
Epoch 37/500
600/600 [==============================] - 0s 50us/sample - loss: 0.3012 - mae: 0.4673 - val_loss: 0.3141 - val_mae: 0.4890
Epoch 38/500
600/600 [==============================] - 0s 55us/sample - loss: 0.2981 - mae: 0.4651 - val_loss: 0.3111 - val_mae: 0.4869
Epoch 39/500
600/600 [==============================] - 0s 51us/sample - loss: 0.2952 - mae: 0.4625 - val_loss: 0.3078 - val_mae: 0.4841
Epoch 40/500
600/600 [==============================] - 0s 43us/sample - loss: 0.2921 - mae: 0.4602 - val_loss: 0.3049 - val_mae: 0.4822
Epoch 41/500
600/600 [==============================] - 0s 42us/sample - loss: 0.2891 - mae: 0.4585 - val_loss: 0.3021 - val_mae: 0.4810
Epoch 42/500
600/600 [==============================] - 0s 44us/sample - loss: 0.2861 - mae: 0.4568 - val_loss: 0.2991 - val_mae: 0.4790
Epoch 43/500
600/600 [==============================] - 0s 48us/sample - loss: 0.2832 - mae: 0.4546 - val_loss: 0.2961 - val_mae: 0.4767
Epoch 44/500
600/600 [==============================] - 0s 42us/sample - loss: 0.2803 - mae: 0.4523 - val_loss: 0.2931 - val_mae: 0.4741
Epoch 45/500
600/600 [==============================] - 0s 41us/sample - loss: 0.2775 - mae: 0.4503 - val_loss: 0.2902 - val_mae: 0.4723
Epoch 46/500
600/600 [==============================] - 0s 48us/sample - loss: 0.2746 - mae: 0.4482 - val_loss: 0.2873 - val_mae: 0.4701
Epoch 47/500
600/600 [==============================] - 0s 45us/sample - loss: 0.2719 - mae: 0.4464 - val_loss: 0.2846 - val_mae: 0.4685
Epoch 48/500
600/600 [==============================] - 0s 44us/sample - loss: 0.2691 - mae: 0.4444 - val_loss: 0.2818 - val_mae: 0.4666
Epoch 49/500
600/600 [==============================] - 0s 44us/sample - loss: 0.2663 - mae: 0.4425 - val_loss: 0.2791 - val_mae: 0.4646
Epoch 50/500
600/600 [==============================] - 0s 46us/sample - loss: 0.2636 - mae: 0.4404 - val_loss: 0.2764 - val_mae: 0.4625
Epoch 51/500
600/600 [==============================] - 0s 49us/sample - loss: 0.2610 - mae: 0.4382 - val_loss: 0.2736 - val_mae: 0.4599
Epoch 52/500
600/600 [==============================] - 0s 41us/sample - loss: 0.2583 - mae: 0.4361 - val_loss: 0.2711 - val_mae: 0.4580
Epoch 53/500
600/600 [==============================] - 0s 46us/sample - loss: 0.2558 - mae: 0.4344 - val_loss: 0.2685 - val_mae: 0.4561
Epoch 54/500
600/600 [==============================] - 0s 46us/sample - loss: 0.2532 - mae: 0.4326 - val_loss: 0.2659 - val_mae: 0.4539
Epoch 55/500
600/600 [==============================] - 0s 49us/sample - loss: 0.2508 - mae: 0.4307 - val_loss: 0.2634 - val_mae: 0.4518
Epoch 56/500
600/600 [==============================] - 0s 65us/sample - loss: 0.2483 - mae: 0.4288 - val_loss: 0.2609 - val_mae: 0.4499
Epoch 57/500
600/600 [==============================] - 0s 49us/sample - loss: 0.2459 - mae: 0.4271 - val_loss: 0.2586 - val_mae: 0.4485
Epoch 58/500
600/600 [==============================] - 0s 51us/sample - loss: 0.2436 - mae: 0.4255 - val_loss: 0.2561 - val_mae: 0.4464
Epoch 59/500
600/600 [==============================] - 0s 51us/sample - loss: 0.2411 - mae: 0.4239 - val_loss: 0.2540 - val_mae: 0.4451
Epoch 60/500
600/600 [==============================] - 0s 47us/sample - loss: 0.2387 - mae: 0.4220 - val_loss: 0.2516 - val_mae: 0.4431
Epoch 61/500
600/600 [==============================] - 0s 46us/sample - loss: 0.2365 - mae: 0.4202 - val_loss: 0.2493 - val_mae: 0.4411
Epoch 62/500
600/600 [==============================] - 0s 50us/sample - loss: 0.2343 - mae: 0.4186 - val_loss: 0.2472 - val_mae: 0.4395
Epoch 63/500
600/600 [==============================] - 0s 52us/sample - loss: 0.2322 - mae: 0.4169 - val_loss: 0.2450 - val_mae: 0.4375
Epoch 64/500
600/600 [==============================] - 0s 51us/sample - loss: 0.2301 - mae: 0.4151 - val_loss: 0.2428 - val_mae: 0.4355
Epoch 65/500
600/600 [==============================] - 0s 53us/sample - loss: 0.2280 - mae: 0.4134 - val_loss: 0.2408 - val_mae: 0.4338
Epoch 66/500
600/600 [==============================] - 0s 48us/sample - loss: 0.2260 - mae: 0.4118 - val_loss: 0.2388 - val_mae: 0.4323
Epoch 67/500
600/600 [==============================] - 0s 54us/sample - loss: 0.2241 - mae: 0.4104 - val_loss: 0.2369 - val_mae: 0.4308
Epoch 68/500
600/600 [==============================] - 0s 49us/sample - loss: 0.2222 - mae: 0.4089 - val_loss: 0.2351 - val_mae: 0.4293
Epoch 69/500
600/600 [==============================] - 0s 51us/sample - loss: 0.2204 - mae: 0.4076 - val_loss: 0.2334 - val_mae: 0.4280
Epoch 70/500
600/600 [==============================] - 0s 49us/sample - loss: 0.2188 - mae: 0.4062 - val_loss: 0.2314 - val_mae: 0.4255
Epoch 71/500
600/600 [==============================] - 0s 46us/sample - loss: 0.2168 - mae: 0.4043 - val_loss: 0.2297 - val_mae: 0.4246
Epoch 72/500
600/600 [==============================] - 0s 43us/sample - loss: 0.2151 - mae: 0.4031 - val_loss: 0.2280 - val_mae: 0.4231
Epoch 73/500
600/600 [==============================] - 0s 40us/sample - loss: 0.2135 - mae: 0.4019 - val_loss: 0.2265 - val_mae: 0.4224
Epoch 74/500
600/600 [==============================] - 0s 43us/sample - loss: 0.2120 - mae: 0.4007 - val_loss: 0.2247 - val_mae: 0.4203
Epoch 75/500
600/600 [==============================] - 0s 45us/sample - loss: 0.2102 - mae: 0.3992 - val_loss: 0.2233 - val_mae: 0.4194
Epoch 76/500
600/600 [==============================] - 0s 43us/sample - loss: 0.2087 - mae: 0.3980 - val_loss: 0.2216 - val_mae: 0.4178
Epoch 77/500
600/600 [==============================] - 0s 43us/sample - loss: 0.2071 - mae: 0.3965 - val_loss: 0.2199 - val_mae: 0.4158
Epoch 78/500
600/600 [==============================] - 0s 42us/sample - loss: 0.2056 - mae: 0.3951 - val_loss: 0.2185 - val_mae: 0.4144
Epoch 79/500
600/600 [==============================] - 0s 44us/sample - loss: 0.2044 - mae: 0.3938 - val_loss: 0.2170 - val_mae: 0.4122
Epoch 80/500
600/600 [==============================] - 0s 41us/sample - loss: 0.2029 - mae: 0.3926 - val_loss: 0.2159 - val_mae: 0.4123
Epoch 81/500
600/600 [==============================] - 0s 43us/sample - loss: 0.2015 - mae: 0.3915 - val_loss: 0.2145 - val_mae: 0.4108
Epoch 82/500
600/600 [==============================] - 0s 42us/sample - loss: 0.2002 - mae: 0.3902 - val_loss: 0.2131 - val_mae: 0.4091
Epoch 83/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1989 - mae: 0.3890 - val_loss: 0.2119 - val_mae: 0.4081
Epoch 84/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1977 - mae: 0.3878 - val_loss: 0.2107 - val_mae: 0.4071
Epoch 85/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1965 - mae: 0.3867 - val_loss: 0.2095 - val_mae: 0.4057
Epoch 86/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1953 - mae: 0.3857 - val_loss: 0.2082 - val_mae: 0.4044
Epoch 87/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1941 - mae: 0.3843 - val_loss: 0.2072 - val_mae: 0.4032
Epoch 88/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1930 - mae: 0.3834 - val_loss: 0.2062 - val_mae: 0.4028
Epoch 89/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1920 - mae: 0.3825 - val_loss: 0.2053 - val_mae: 0.4018
Epoch 90/500
600/600 [==============================] - 0s 60us/sample - loss: 0.1913 - mae: 0.3819 - val_loss: 0.2046 - val_mae: 0.4018
Epoch 91/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1902 - mae: 0.3808 - val_loss: 0.2033 - val_mae: 0.3994
Epoch 92/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1892 - mae: 0.3796 - val_loss: 0.2025 - val_mae: 0.3989
Epoch 93/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1882 - mae: 0.3786 - val_loss: 0.2015 - val_mae: 0.3970
Epoch 94/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1875 - mae: 0.3776 - val_loss: 0.2006 - val_mae: 0.3959
Epoch 95/500
600/600 [==============================] - 0s 58us/sample - loss: 0.1870 - mae: 0.3768 - val_loss: 0.1998 - val_mae: 0.3941
Epoch 96/500
600/600 [==============================] - 0s 67us/sample - loss: 0.1861 - mae: 0.3760 - val_loss: 0.1992 - val_mae: 0.3947
Epoch 97/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1852 - mae: 0.3751 - val_loss: 0.1984 - val_mae: 0.3937
Epoch 98/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1843 - mae: 0.3742 - val_loss: 0.1980 - val_mae: 0.3939
Epoch 99/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1837 - mae: 0.3737 - val_loss: 0.1976 - val_mae: 0.3940
Epoch 100/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1832 - mae: 0.3733 - val_loss: 0.1970 - val_mae: 0.3936
Epoch 101/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1828 - mae: 0.3727 - val_loss: 0.1960 - val_mae: 0.3910
Epoch 102/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1820 - mae: 0.3717 - val_loss: 0.1956 - val_mae: 0.3913
Epoch 103/500
600/600 [==============================] - 0s 64us/sample - loss: 0.1812 - mae: 0.3708 - val_loss: 0.1950 - val_mae: 0.3903
Epoch 104/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1806 - mae: 0.3701 - val_loss: 0.1946 - val_mae: 0.3898
Epoch 105/500
600/600 [==============================] - 0s 58us/sample - loss: 0.1802 - mae: 0.3695 - val_loss: 0.1939 - val_mae: 0.3886
Epoch 106/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1795 - mae: 0.3686 - val_loss: 0.1932 - val_mae: 0.3871
Epoch 107/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1790 - mae: 0.3679 - val_loss: 0.1928 - val_mae: 0.3866
Epoch 108/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1786 - mae: 0.3674 - val_loss: 0.1924 - val_mae: 0.3864
Epoch 109/500
600/600 [==============================] - 0s 40us/sample - loss: 0.1783 - mae: 0.3667 - val_loss: 0.1919 - val_mae: 0.3849
Epoch 110/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1781 - mae: 0.3666 - val_loss: 0.1919 - val_mae: 0.3861
Epoch 111/500
600/600 [==============================] - 0s 68us/sample - loss: 0.1774 - mae: 0.3658 - val_loss: 0.1912 - val_mae: 0.3843
Epoch 112/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1770 - mae: 0.3653 - val_loss: 0.1911 - val_mae: 0.3846
Epoch 113/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1766 - mae: 0.3647 - val_loss: 0.1906 - val_mae: 0.3833
Epoch 114/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1763 - mae: 0.3642 - val_loss: 0.1903 - val_mae: 0.3831
Epoch 115/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1758 - mae: 0.3636 - val_loss: 0.1898 - val_mae: 0.3817
Epoch 116/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1755 - mae: 0.3630 - val_loss: 0.1897 - val_mae: 0.3821
Epoch 117/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1752 - mae: 0.3627 - val_loss: 0.1893 - val_mae: 0.3810
Epoch 118/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1749 - mae: 0.3621 - val_loss: 0.1890 - val_mae: 0.3805
Epoch 119/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1747 - mae: 0.3617 - val_loss: 0.1888 - val_mae: 0.3802
Epoch 120/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1743 - mae: 0.3612 - val_loss: 0.1885 - val_mae: 0.3794
Epoch 121/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1743 - mae: 0.3610 - val_loss: 0.1885 - val_mae: 0.3803
Epoch 122/500
600/600 [==============================] - 0s 60us/sample - loss: 0.1740 - mae: 0.3608 - val_loss: 0.1884 - val_mae: 0.3802
Epoch 123/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1736 - mae: 0.3602 - val_loss: 0.1879 - val_mae: 0.3786
Epoch 124/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1737 - mae: 0.3597 - val_loss: 0.1876 - val_mae: 0.3765
Epoch 125/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1738 - mae: 0.3597 - val_loss: 0.1876 - val_mae: 0.3780
Epoch 126/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1734 - mae: 0.3591 - val_loss: 0.1872 - val_mae: 0.3762
Epoch 127/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1727 - mae: 0.3583 - val_loss: 0.1873 - val_mae: 0.3775
Epoch 128/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1726 - mae: 0.3583 - val_loss: 0.1872 - val_mae: 0.3776
Epoch 129/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1724 - mae: 0.3579 - val_loss: 0.1869 - val_mae: 0.3763
Epoch 130/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1723 - mae: 0.3575 - val_loss: 0.1867 - val_mae: 0.3757
Epoch 131/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1722 - mae: 0.3573 - val_loss: 0.1866 - val_mae: 0.3759
Epoch 132/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1720 - mae: 0.3572 - val_loss: 0.1868 - val_mae: 0.3770
Epoch 133/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1721 - mae: 0.3570 - val_loss: 0.1864 - val_mae: 0.3754
Epoch 134/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1717 - mae: 0.3566 - val_loss: 0.1864 - val_mae: 0.3754
Epoch 135/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1717 - mae: 0.3563 - val_loss: 0.1861 - val_mae: 0.3741
Epoch 136/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1715 - mae: 0.3559 - val_loss: 0.1861 - val_mae: 0.3744
Epoch 137/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1714 - mae: 0.3558 - val_loss: 0.1861 - val_mae: 0.3748
Epoch 138/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1713 - mae: 0.3555 - val_loss: 0.1859 - val_mae: 0.3737
Epoch 139/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1712 - mae: 0.3551 - val_loss: 0.1857 - val_mae: 0.3731
Epoch 140/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1712 - mae: 0.3551 - val_loss: 0.1857 - val_mae: 0.3732
Epoch 141/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1710 - mae: 0.3547 - val_loss: 0.1856 - val_mae: 0.3724
Epoch 142/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1713 - mae: 0.3546 - val_loss: 0.1855 - val_mae: 0.3718
Epoch 143/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1711 - mae: 0.3545 - val_loss: 0.1857 - val_mae: 0.3740
Epoch 144/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1708 - mae: 0.3545 - val_loss: 0.1856 - val_mae: 0.3733
Epoch 145/500
600/600 [==============================] - 0s 55us/sample - loss: 0.1708 - mae: 0.3541 - val_loss: 0.1854 - val_mae: 0.3717
Epoch 146/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1707 - mae: 0.3539 - val_loss: 0.1854 - val_mae: 0.3720
Epoch 147/500
600/600 [==============================] - 0s 55us/sample - loss: 0.1706 - mae: 0.3539 - val_loss: 0.1854 - val_mae: 0.3725
Epoch 148/500
600/600 [==============================] - 0s 55us/sample - loss: 0.1706 - mae: 0.3537 - val_loss: 0.1853 - val_mae: 0.3722
Epoch 149/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1705 - mae: 0.3536 - val_loss: 0.1853 - val_mae: 0.3725
Epoch 150/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1707 - mae: 0.3537 - val_loss: 0.1853 - val_mae: 0.3720
Epoch 151/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1704 - mae: 0.3532 - val_loss: 0.1851 - val_mae: 0.3704
Epoch 152/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1705 - mae: 0.3530 - val_loss: 0.1851 - val_mae: 0.3709
Epoch 153/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1703 - mae: 0.3529 - val_loss: 0.1851 - val_mae: 0.3714
Epoch 154/500
600/600 [==============================] - 0s 63us/sample - loss: 0.1703 - mae: 0.3530 - val_loss: 0.1852 - val_mae: 0.3720
Epoch 155/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1703 - mae: 0.3529 - val_loss: 0.1851 - val_mae: 0.3713
Epoch 156/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1702 - mae: 0.3526 - val_loss: 0.1850 - val_mae: 0.3711
Epoch 157/500
600/600 [==============================] - 0s 55us/sample - loss: 0.1701 - mae: 0.3526 - val_loss: 0.1852 - val_mae: 0.3719
Epoch 158/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1701 - mae: 0.3528 - val_loss: 0.1852 - val_mae: 0.3721
Epoch 159/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1705 - mae: 0.3528 - val_loss: 0.1849 - val_mae: 0.3698
Epoch 160/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1701 - mae: 0.3525 - val_loss: 0.1852 - val_mae: 0.3723
Epoch 161/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1701 - mae: 0.3528 - val_loss: 0.1851 - val_mae: 0.3721
Epoch 162/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1701 - mae: 0.3527 - val_loss: 0.1851 - val_mae: 0.3717
Epoch 163/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1701 - mae: 0.3527 - val_loss: 0.1852 - val_mae: 0.3722
Epoch 164/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1704 - mae: 0.3531 - val_loss: 0.1852 - val_mae: 0.3722
Epoch 165/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1700 - mae: 0.3525 - val_loss: 0.1847 - val_mae: 0.3697
Epoch 166/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1702 - mae: 0.3518 - val_loss: 0.1847 - val_mae: 0.3694
Epoch 167/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1704 - mae: 0.3519 - val_loss: 0.1847 - val_mae: 0.3680
Epoch 168/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1699 - mae: 0.3516 - val_loss: 0.1848 - val_mae: 0.3704
Epoch 169/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1700 - mae: 0.3522 - val_loss: 0.1851 - val_mae: 0.3718
Epoch 170/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1700 - mae: 0.3524 - val_loss: 0.1851 - val_mae: 0.3720
Epoch 171/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1699 - mae: 0.3522 - val_loss: 0.1848 - val_mae: 0.3702
Epoch 172/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1698 - mae: 0.3518 - val_loss: 0.1849 - val_mae: 0.3711
Epoch 173/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1699 - mae: 0.3521 - val_loss: 0.1849 - val_mae: 0.3710
Epoch 174/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1699 - mae: 0.3521 - val_loss: 0.1849 - val_mae: 0.3711
Epoch 175/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1700 - mae: 0.3518 - val_loss: 0.1847 - val_mae: 0.3699
Epoch 176/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1699 - mae: 0.3517 - val_loss: 0.1847 - val_mae: 0.3701
Epoch 177/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1702 - mae: 0.3524 - val_loss: 0.1852 - val_mae: 0.3721
Epoch 178/500
600/600 [==============================] - 0s 62us/sample - loss: 0.1700 - mae: 0.3523 - val_loss: 0.1849 - val_mae: 0.3710
Epoch 179/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1697 - mae: 0.3517 - val_loss: 0.1847 - val_mae: 0.3701
Epoch 180/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1703 - mae: 0.3515 - val_loss: 0.1846 - val_mae: 0.3681
Epoch 181/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1697 - mae: 0.3515 - val_loss: 0.1849 - val_mae: 0.3708
Epoch 182/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1698 - mae: 0.3518 - val_loss: 0.1850 - val_mae: 0.3715
Epoch 183/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1698 - mae: 0.3520 - val_loss: 0.1848 - val_mae: 0.3708
Epoch 184/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1698 - mae: 0.3516 - val_loss: 0.1846 - val_mae: 0.3690
Epoch 185/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1699 - mae: 0.3514 - val_loss: 0.1846 - val_mae: 0.3698
Epoch 186/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1700 - mae: 0.3517 - val_loss: 0.1848 - val_mae: 0.3706
Epoch 187/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1696 - mae: 0.3513 - val_loss: 0.1846 - val_mae: 0.3693
Epoch 188/500
600/600 [==============================] - 0s 63us/sample - loss: 0.1697 - mae: 0.3511 - val_loss: 0.1845 - val_mae: 0.3687
Epoch 189/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1698 - mae: 0.3508 - val_loss: 0.1845 - val_mae: 0.3675
Epoch 190/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1699 - mae: 0.3510 - val_loss: 0.1845 - val_mae: 0.3688
Epoch 191/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1698 - mae: 0.3509 - val_loss: 0.1846 - val_mae: 0.3693
Epoch 192/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1698 - mae: 0.3512 - val_loss: 0.1848 - val_mae: 0.3706
Epoch 193/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1700 - mae: 0.3520 - val_loss: 0.1850 - val_mae: 0.3714
Epoch 194/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1698 - mae: 0.3513 - val_loss: 0.1845 - val_mae: 0.3684
Epoch 195/500
600/600 [==============================] - 0s 59us/sample - loss: 0.1697 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3687
Epoch 196/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1846 - val_mae: 0.3691
Epoch 197/500
600/600 [==============================] - 0s 76us/sample - loss: 0.1697 - mae: 0.3508 - val_loss: 0.1845 - val_mae: 0.3684
Epoch 198/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1698 - mae: 0.3506 - val_loss: 0.1845 - val_mae: 0.3683
Epoch 199/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1698 - mae: 0.3510 - val_loss: 0.1848 - val_mae: 0.3703
Epoch 200/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1696 - mae: 0.3511 - val_loss: 0.1846 - val_mae: 0.3690
Epoch 201/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3509 - val_loss: 0.1846 - val_mae: 0.3694
Epoch 202/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1697 - mae: 0.3512 - val_loss: 0.1847 - val_mae: 0.3696
Epoch 203/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1697 - mae: 0.3513 - val_loss: 0.1850 - val_mae: 0.3708
Epoch 204/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1696 - mae: 0.3513 - val_loss: 0.1847 - val_mae: 0.3697
Epoch 205/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1845 - val_mae: 0.3685
Epoch 206/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1699 - mae: 0.3504 - val_loss: 0.1845 - val_mae: 0.3669
Epoch 207/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3500 - val_loss: 0.1845 - val_mae: 0.3680
Epoch 208/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1696 - mae: 0.3503 - val_loss: 0.1846 - val_mae: 0.3687
Epoch 209/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1696 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3690
Epoch 210/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1698 - mae: 0.3506 - val_loss: 0.1846 - val_mae: 0.3687
Epoch 211/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1699 - mae: 0.3513 - val_loss: 0.1849 - val_mae: 0.3703
Epoch 212/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3510 - val_loss: 0.1846 - val_mae: 0.3693
Epoch 213/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1697 - mae: 0.3504 - val_loss: 0.1845 - val_mae: 0.3677
Epoch 214/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3502 - val_loss: 0.1845 - val_mae: 0.3681
Epoch 215/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1695 - mae: 0.3505 - val_loss: 0.1847 - val_mae: 0.3698
Epoch 216/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1696 - mae: 0.3510 - val_loss: 0.1848 - val_mae: 0.3702
Epoch 217/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1698 - mae: 0.3512 - val_loss: 0.1846 - val_mae: 0.3694
Epoch 218/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1847 - val_mae: 0.3699
Epoch 219/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1696 - mae: 0.3511 - val_loss: 0.1847 - val_mae: 0.3700
Epoch 220/500
600/600 [==============================] - 0s 60us/sample - loss: 0.1697 - mae: 0.3513 - val_loss: 0.1848 - val_mae: 0.3705
Epoch 221/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3513 - val_loss: 0.1847 - val_mae: 0.3699
Epoch 222/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1698 - mae: 0.3515 - val_loss: 0.1848 - val_mae: 0.3707
Epoch 223/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1697 - mae: 0.3514 - val_loss: 0.1845 - val_mae: 0.3695
Epoch 224/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3691
Epoch 225/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1696 - mae: 0.3511 - val_loss: 0.1846 - val_mae: 0.3695
Epoch 226/500
600/600 [==============================] - 0s 57us/sample - loss: 0.1697 - mae: 0.3510 - val_loss: 0.1845 - val_mae: 0.3691
Epoch 227/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1698 - mae: 0.3513 - val_loss: 0.1846 - val_mae: 0.3699
Epoch 228/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1699 - mae: 0.3510 - val_loss: 0.1844 - val_mae: 0.3685
Epoch 229/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3510 - val_loss: 0.1845 - val_mae: 0.3691
Epoch 230/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1696 - mae: 0.3510 - val_loss: 0.1846 - val_mae: 0.3696
Epoch 231/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1845 - val_mae: 0.3689
Epoch 232/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3512 - val_loss: 0.1846 - val_mae: 0.3697
Epoch 233/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1698 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3689
Epoch 234/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1846 - val_mae: 0.3694
Epoch 235/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1696 - mae: 0.3511 - val_loss: 0.1846 - val_mae: 0.3693
Epoch 236/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1700 - mae: 0.3506 - val_loss: 0.1844 - val_mae: 0.3673
Epoch 237/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1698 - mae: 0.3502 - val_loss: 0.1844 - val_mae: 0.3676
Epoch 238/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1696 - mae: 0.3505 - val_loss: 0.1845 - val_mae: 0.3690
Epoch 239/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1697 - mae: 0.3508 - val_loss: 0.1845 - val_mae: 0.3691
Epoch 240/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3506 - val_loss: 0.1844 - val_mae: 0.3676
Epoch 241/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1698 - mae: 0.3502 - val_loss: 0.1844 - val_mae: 0.3674
Epoch 242/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1697 - mae: 0.3507 - val_loss: 0.1847 - val_mae: 0.3696
Epoch 243/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1697 - mae: 0.3508 - val_loss: 0.1845 - val_mae: 0.3685
Epoch 244/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1697 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3689
Epoch 245/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1701 - mae: 0.3519 - val_loss: 0.1856 - val_mae: 0.3727
Epoch 246/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1701 - mae: 0.3519 - val_loss: 0.1850 - val_mae: 0.3708
Epoch 247/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1698 - mae: 0.3516 - val_loss: 0.1848 - val_mae: 0.3702
Epoch 248/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1697 - mae: 0.3508 - val_loss: 0.1844 - val_mae: 0.3671
Epoch 249/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1700 - mae: 0.3506 - val_loss: 0.1844 - val_mae: 0.3682
Epoch 250/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1696 - mae: 0.3503 - val_loss: 0.1844 - val_mae: 0.3676
Epoch 251/500
600/600 [==============================] - 0s 61us/sample - loss: 0.1697 - mae: 0.3504 - val_loss: 0.1844 - val_mae: 0.3676
Epoch 252/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1695 - mae: 0.3502 - val_loss: 0.1845 - val_mae: 0.3687
Epoch 253/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1695 - mae: 0.3507 - val_loss: 0.1847 - val_mae: 0.3698
Epoch 254/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1697 - mae: 0.3512 - val_loss: 0.1849 - val_mae: 0.3704
Epoch 255/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1698 - mae: 0.3514 - val_loss: 0.1848 - val_mae: 0.3700
Epoch 256/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1697 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3680
Epoch 257/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1696 - mae: 0.3503 - val_loss: 0.1844 - val_mae: 0.3679
Epoch 258/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1696 - mae: 0.3503 - val_loss: 0.1845 - val_mae: 0.3685
Epoch 259/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1695 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3689
Epoch 260/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1847 - val_mae: 0.3698
Epoch 261/500
600/600 [==============================] - 0s 55us/sample - loss: 0.1697 - mae: 0.3511 - val_loss: 0.1847 - val_mae: 0.3698
Epoch 262/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1699 - mae: 0.3510 - val_loss: 0.1845 - val_mae: 0.3684
Epoch 263/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3506 - val_loss: 0.1845 - val_mae: 0.3685
Epoch 264/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1696 - mae: 0.3506 - val_loss: 0.1846 - val_mae: 0.3692
Epoch 265/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1698 - mae: 0.3513 - val_loss: 0.1848 - val_mae: 0.3700
Epoch 266/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1846 - val_mae: 0.3691
Epoch 267/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1847 - val_mae: 0.3696
Epoch 268/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1697 - mae: 0.3507 - val_loss: 0.1845 - val_mae: 0.3681
Epoch 269/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1696 - mae: 0.3504 - val_loss: 0.1846 - val_mae: 0.3686
Epoch 270/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3510 - val_loss: 0.1848 - val_mae: 0.3699
Epoch 271/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1699 - mae: 0.3516 - val_loss: 0.1848 - val_mae: 0.3701
Epoch 272/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1698 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3683
Epoch 273/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1848 - val_mae: 0.3699
Epoch 274/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1696 - mae: 0.3510 - val_loss: 0.1847 - val_mae: 0.3697
Epoch 275/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1846 - val_mae: 0.3690
Epoch 276/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1847 - val_mae: 0.3693
Epoch 277/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1695 - mae: 0.3504 - val_loss: 0.1845 - val_mae: 0.3679
Epoch 278/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1697 - mae: 0.3502 - val_loss: 0.1845 - val_mae: 0.3678
Epoch 279/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3687
Epoch 280/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1696 - mae: 0.3506 - val_loss: 0.1846 - val_mae: 0.3688
Epoch 281/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1698 - mae: 0.3510 - val_loss: 0.1848 - val_mae: 0.3700
Epoch 282/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1696 - mae: 0.3509 - val_loss: 0.1847 - val_mae: 0.3694
Epoch 283/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1696 - mae: 0.3507 - val_loss: 0.1846 - val_mae: 0.3688
Epoch 284/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1696 - mae: 0.3508 - val_loss: 0.1847 - val_mae: 0.3692
Epoch 285/500
600/600 [==============================] - 0s 58us/sample - loss: 0.1695 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3682
Epoch 286/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1699 - mae: 0.3501 - val_loss: 0.1846 - val_mae: 0.3664
Epoch 287/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1698 - mae: 0.3502 - val_loss: 0.1845 - val_mae: 0.3682
Epoch 288/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1696 - mae: 0.3504 - val_loss: 0.1846 - val_mae: 0.3683
Epoch 289/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1695 - mae: 0.3504 - val_loss: 0.1846 - val_mae: 0.3690
Epoch 290/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1696 - mae: 0.3506 - val_loss: 0.1847 - val_mae: 0.3689
Epoch 291/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1696 - mae: 0.3506 - val_loss: 0.1847 - val_mae: 0.3694
Epoch 292/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1698 - mae: 0.3508 - val_loss: 0.1846 - val_mae: 0.3683
Epoch 293/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1701 - mae: 0.3513 - val_loss: 0.1850 - val_mae: 0.3705
Epoch 294/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1702 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3678
Epoch 295/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1696 - mae: 0.3505 - val_loss: 0.1849 - val_mae: 0.3702
Epoch 296/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3510 - val_loss: 0.1848 - val_mae: 0.3699
Epoch 297/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1697 - mae: 0.3509 - val_loss: 0.1847 - val_mae: 0.3691
Epoch 298/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1696 - mae: 0.3507 - val_loss: 0.1848 - val_mae: 0.3695
Epoch 299/500
600/600 [==============================] - 0s 42us/sample - loss: 0.1696 - mae: 0.3507 - val_loss: 0.1847 - val_mae: 0.3690
Epoch 300/500
600/600 [==============================] - 0s 53us/sample - loss: 0.1696 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3684
Epoch 301/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1696 - mae: 0.3504 - val_loss: 0.1846 - val_mae: 0.3685
Epoch 302/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1698 - mae: 0.3507 - val_loss: 0.1848 - val_mae: 0.3696
Epoch 303/500
600/600 [==============================] - 0s 41us/sample - loss: 0.1695 - mae: 0.3506 - val_loss: 0.1846 - val_mae: 0.3684
Epoch 304/500
600/600 [==============================] - 0s 44us/sample - loss: 0.1700 - mae: 0.3502 - val_loss: 0.1845 - val_mae: 0.3667
Epoch 305/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1696 - mae: 0.3498 - val_loss: 0.1845 - val_mae: 0.3679
Epoch 306/500
600/600 [==============================] - 0s 56us/sample - loss: 0.1699 - mae: 0.3509 - val_loss: 0.1850 - val_mae: 0.3706
Epoch 307/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1697 - mae: 0.3513 - val_loss: 0.1847 - val_mae: 0.3694
Epoch 308/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1697 - mae: 0.3506 - val_loss: 0.1845 - val_mae: 0.3682
Epoch 309/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1697 - mae: 0.3506 - val_loss: 0.1847 - val_mae: 0.3691
Epoch 310/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1696 - mae: 0.3504 - val_loss: 0.1845 - val_mae: 0.3680
Epoch 311/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1699 - mae: 0.3503 - val_loss: 0.1845 - val_mae: 0.3677
Epoch 312/500
600/600 [==============================] - 0s 52us/sample - loss: 0.1694 - mae: 0.3502 - val_loss: 0.1847 - val_mae: 0.3692
Epoch 313/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1698 - mae: 0.3512 - val_loss: 0.1850 - val_mae: 0.3706
Epoch 314/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1698 - mae: 0.3509 - val_loss: 0.1845 - val_mae: 0.3678
Epoch 315/500
600/600 [==============================] - 0s 51us/sample - loss: 0.1697 - mae: 0.3503 - val_loss: 0.1845 - val_mae: 0.3674
Epoch 316/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1697 - mae: 0.3503 - val_loss: 0.1845 - val_mae: 0.3680
Epoch 317/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1696 - mae: 0.3501 - val_loss: 0.1845 - val_mae: 0.3675
Epoch 318/500
600/600 [==============================] - 0s 59us/sample - loss: 0.1697 - mae: 0.3500 - val_loss: 0.1845 - val_mae: 0.3674
Epoch 319/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1697 - mae: 0.3499 - val_loss: 0.1845 - val_mae: 0.3672
Epoch 320/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3503 - val_loss: 0.1846 - val_mae: 0.3685
Epoch 321/500
600/600 [==============================] - 0s 50us/sample - loss: 0.1697 - mae: 0.3507 - val_loss: 0.1847 - val_mae: 0.3695
Epoch 322/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1696 - mae: 0.3505 - val_loss: 0.1845 - val_mae: 0.3677
Epoch 323/500
600/600 [==============================] - 0s 48us/sample - loss: 0.1697 - mae: 0.3501 - val_loss: 0.1845 - val_mae: 0.3676
Epoch 324/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1696 - mae: 0.3502 - val_loss: 0.1845 - val_mae: 0.3680
Epoch 325/500
600/600 [==============================] - 0s 54us/sample - loss: 0.1696 - mae: 0.3505 - val_loss: 0.1846 - val_mae: 0.3690
Epoch 326/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1696 - mae: 0.3504 - val_loss: 0.1845 - val_mae: 0.3682
Epoch 327/500
600/600 [==============================] - 0s 47us/sample - loss: 0.1697 - mae: 0.3503 - val_loss: 0.1845 - val_mae: 0.3682
Epoch 328/500
600/600 [==============================] - 0s 49us/sample - loss: 0.1696 - mae: 0.3504 - val_loss: 0.1845 - val_mae: 0.3682
Epoch 329/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1695 - mae: 0.3503 - val_loss: 0.1846 - val_mae: 0.3684
Epoch 330/500
600/600 [==============================] - 0s 45us/sample - loss: 0.1696 - mae: 0.3506 - val_loss: 0.1846 - val_mae: 0.3691
...
Epoch 400/500
600/600 [==============================] - 0s 43us/sample - loss: 0.1697 - mae: 0.3498 - val_loss: 0.1845 - val_mae: 0.3667
...
Epoch 500/500
600/600 [==============================] - 0s 46us/sample - loss: 0.1697 - mae: 0.3503 - val_loss: 0.1845 - val_mae: 0.3681

3. Plot Metrics

1. Mean Squared Error

During training, the model's performance is measured at the end of each epoch against both our training data and the validation data that we set aside earlier. Training produces a log that tells us how the model's performance changed over the course of the training process.
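
That log lives on the History object that fit() returned, which we stored as history_1. If you're curious exactly which metrics were recorded, you can list its keys:

# The History object stores one list per metric, with one entry per epoch.
print(history_1.history.keys())
# dict_keys(['loss', 'mae', 'val_loss', 'val_mae'])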

The following cells will display some of that data in a graphical form:


In [10]:
# Draw a graph of the loss, which is the distance between
# the predicted and actual values during training and validation.
loss = history_1.history['loss']
val_loss = history_1.history['val_loss']

epochs = range(1, len(loss) + 1)

plt.plot(epochs, loss, 'g.', label='Training loss')
plt.plot(epochs, val_loss, 'b', label='Validation loss')
plt.title('Training and validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()
plt.show()


The graph shows the loss (or the difference between the model's predictions and the actual data) for each epoch. There are several ways to calculate loss, and the method we have used is mean squared error. There is a distinct loss value given for the training and the validation data.
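
If you'd like to see what mean squared error actually computes, here is the same calculation done by hand on a few made-up values (this snippet is purely illustrative and plays no part in training):

# Illustrative only: MSE is the mean of the squared differences
# between predicted and actual values.
y_true = np.array([0.0, 0.5, 1.0])
y_pred = np.array([0.1, 0.4, 0.9])
print(np.mean((y_true - y_pred) ** 2))  # 0.01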

As we can see, the amount of loss rapidly decreases over the first 25 epochs, before flattening out. This means that the model is improving and producing more accurate predictions!

Our goal is to stop training when the model is no longer improving, or when the validation loss stops decreasing while the training loss keeps falling, which would mean that the model has learned the training data so well that it can no longer generalize to new data.
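
In this notebook we keep things simple and always train for the full 500 epochs, but Keras can automate that stopping rule with an EarlyStopping callback. A minimal sketch (not used anywhere in this notebook; the patience value of 20 is an arbitrary choice):

# Sketch only: stop training once validation loss stops improving.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor='val_loss',          # watch the validation loss
    patience=20,                 # allow 20 epochs with no improvement
    restore_best_weights=True)   # keep the weights from the best epoch

# It would be passed to training via model.fit(..., callbacks=[early_stop])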

To make the flatter part of the graph more readable, let's skip the first 50 epochs:


In [11]:
# Exclude the first few epochs so the graph is easier to read
SKIP = 50

plt.plot(epochs[SKIP:], loss[SKIP:], 'g.', label='Training loss')
plt.plot(epochs[SKIP:], val_loss[SKIP:], 'b.', label='Validation loss')
plt.title('Training and validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()
plt.show()


From the plot, we can see that loss continues to decrease until around 200 epochs, at which point it is mostly stable. This means that there's no need to train our network beyond 200 epochs.

However, we can also see that the lowest loss value is still around 0.155. Since this loss is a mean squared error, it corresponds to a typical prediction error of roughly 0.4 (the square root of 0.155), which is large for outputs that only range from -1 to 1. In addition, the validation loss values jump around a lot and are generally higher than the training loss.
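
Rather than eyeballing the plot, we can also read the best epoch straight out of the val_loss list we extracted above:

# Find the epoch (1-indexed) with the lowest validation loss.
best_epoch = int(np.argmin(val_loss)) + 1
print('Lowest validation loss: %.4f at epoch %d' % (min(val_loss), best_epoch))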

2. Mean Absolute Error

To gain more insight into our model's performance we can plot some more data. This time, we'll plot the mean absolute error, which is another way of measuring how far the network's predictions are from the actual numbers:


In [12]:
plt.clf()

# Draw a graph of mean absolute error, which is another way of
# measuring the amount of error in the prediction.
mae = history_1.history['mae']
val_mae = history_1.history['val_mae']

plt.plot(epochs[SKIP:], mae[SKIP:], 'g.', label='Training MAE')
plt.plot(epochs[SKIP:], val_mae[SKIP:], 'b.', label='Validation MAE')
plt.title('Training and validation mean absolute error')
plt.xlabel('Epochs')
plt.ylabel('MAE')
plt.legend()
plt.show()


This graph of mean absolute error tells another story. We can see that the training data shows consistently lower error than the validation data, which means that the network may have overfit, or learned the training data so rigidly that it can't make effective predictions about new data.

In addition, the mean absolute error values are quite high: ~0.305 at best. Since the sine wave's amplitude is 1, the model's predictions are off by roughly 30% of that amplitude on average, which means we are very far from accurately modelling the sine wave function.
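
We can confirm these figures directly from the training history rather than reading them off the plot, using the mae and val_mae lists defined above:

# Best (lowest) MAE reached on the training and validation data.
print('Best training MAE:   %.4f' % min(mae))
print('Best validation MAE: %.4f' % min(val_mae))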

3. Actual vs Predicted Outputs

To get more insight into what is happening, let's check the model's predictions against the test dataset we set aside earlier:


In [13]:
# Calculate and print the loss on our test dataset
loss = model_1.evaluate(x_test, y_test)

# Make predictions based on our test dataset
predictions = model_1.predict(x_test)

# Graph the predictions against the actual values
plt.clf()
plt.title('Comparison of predictions and actual values')
plt.plot(x_test, y_test, 'b.', label='Actual')
plt.plot(x_test, predictions, 'r.', label='Predicted')
plt.legend()
plt.show()


200/1 [==============================] - 0s 57us/sample - loss: 0.1560 - mae: 0.3435

Oh dear! The graph makes it clear that our network has learned to approximate the sine function in a very limited way.

The rigidity of this fit suggests that the model does not have enough capacity to learn the full complexity of the sine wave function, so it's only able to approximate it in an overly simplistic way. By making our model bigger, we should be able to improve its performance.

Training a Larger Model

1. Design the Model

To make our model bigger, let's add an additional layer of neurons. The following cell redefines our model in the same way as earlier, but with 16 neurons in the first layer and an additional layer of 16 neurons in the middle:


In [0]:
model_2 = tf.keras.Sequential()

# First layer takes a scalar input and feeds it through 16 "neurons". The
# neurons decide whether to activate based on the 'relu' activation function.
model_2.add(keras.layers.Dense(16, activation='relu', input_shape=(1,)))

# The new second layer may help the network learn more complex representations
model_2.add(keras.layers.Dense(16, activation='relu'))

# Final layer is a single neuron, since we want to output a single value
model_2.add(keras.layers.Dense(1))

# Compile the model using a standard optimizer and loss function for regression
model_2.compile(optimizer='adam', loss='mse', metrics=['mae'])
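
This network is still tiny: the three Dense layers contribute 16×(1+1) = 32, 16×(16+1) = 272, and 1×(16+1) = 17 weights and biases respectively, for 321 trainable parameters in total, which we can confirm by printing a summary:

# Optional: inspect the layer structure and parameter count.
model_2.summary()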

2. Train the Model

We'll now train the new model.


In [15]:
history_2 = model_2.fit(x_train, y_train, epochs=500, batch_size=64,
                        validation_data=(x_validate, y_validate))


Train on 600 samples, validate on 200 samples
Epoch 1/500
600/600 [==============================] - 0s 736us/sample - loss: 0.4245 - mae: 0.5529 - val_loss: 0.4310 - val_mae: 0.5678
Epoch 2/500
600/600 [==============================] - 0s 64us/sample - loss: 0.4056 - mae: 0.5462 - val_loss: 0.4138 - val_mae: 0.5548
Epoch 3/500
600/600 [==============================] - 0s 52us/sample - loss: 0.3897 - mae: 0.5302 - val_loss: 0.3974 - val_mae: 0.5437
...
Epoch 100/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0770 - mae: 0.2025 - val_loss: 0.0900 - val_mae: 0.2220
...
Epoch 209/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0328 - mae: 0.1298 - val_loss: 0.0375 - val_mae: 0.1454
Epoch 210/500
600/600 [==============================] - 0s 44us/sample - loss: 0.0328 - mae: 0.1328 - val_loss: 0.0370 - val_mae: 0.1432
Epoch 211/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0331 - mae: 0.1310 - val_loss: 0.0369 - val_mae: 0.1413
Epoch 212/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0317 - mae: 0.1290 - val_loss: 0.0367 - val_mae: 0.1449
Epoch 213/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0318 - mae: 0.1291 - val_loss: 0.0360 - val_mae: 0.1425
Epoch 214/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0312 - mae: 0.1284 - val_loss: 0.0356 - val_mae: 0.1413
Epoch 215/500
600/600 [==============================] - 0s 65us/sample - loss: 0.0309 - mae: 0.1273 - val_loss: 0.0356 - val_mae: 0.1423
Epoch 216/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0310 - mae: 0.1280 - val_loss: 0.0350 - val_mae: 0.1396
Epoch 217/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0303 - mae: 0.1263 - val_loss: 0.0346 - val_mae: 0.1400
Epoch 218/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0302 - mae: 0.1267 - val_loss: 0.0343 - val_mae: 0.1390
Epoch 219/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0299 - mae: 0.1258 - val_loss: 0.0340 - val_mae: 0.1377
Epoch 220/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0299 - mae: 0.1262 - val_loss: 0.0338 - val_mae: 0.1374
Epoch 221/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0294 - mae: 0.1246 - val_loss: 0.0337 - val_mae: 0.1395
Epoch 222/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0299 - mae: 0.1275 - val_loss: 0.0340 - val_mae: 0.1394
Epoch 223/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0295 - mae: 0.1251 - val_loss: 0.0331 - val_mae: 0.1378
Epoch 224/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0290 - mae: 0.1228 - val_loss: 0.0325 - val_mae: 0.1361
Epoch 225/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0291 - mae: 0.1254 - val_loss: 0.0321 - val_mae: 0.1344
Epoch 226/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0286 - mae: 0.1237 - val_loss: 0.0318 - val_mae: 0.1340
Epoch 227/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0281 - mae: 0.1219 - val_loss: 0.0315 - val_mae: 0.1331
Epoch 228/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0280 - mae: 0.1221 - val_loss: 0.0313 - val_mae: 0.1345
Epoch 229/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0276 - mae: 0.1202 - val_loss: 0.0310 - val_mae: 0.1333
Epoch 230/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0276 - mae: 0.1215 - val_loss: 0.0308 - val_mae: 0.1313
Epoch 231/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0274 - mae: 0.1214 - val_loss: 0.0319 - val_mae: 0.1382
Epoch 232/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0281 - mae: 0.1242 - val_loss: 0.0304 - val_mae: 0.1305
Epoch 233/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0268 - mae: 0.1195 - val_loss: 0.0299 - val_mae: 0.1320
Epoch 234/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0264 - mae: 0.1187 - val_loss: 0.0296 - val_mae: 0.1302
Epoch 235/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0267 - mae: 0.1206 - val_loss: 0.0299 - val_mae: 0.1285
Epoch 236/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0272 - mae: 0.1182 - val_loss: 0.0309 - val_mae: 0.1363
Epoch 237/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0273 - mae: 0.1209 - val_loss: 0.0286 - val_mae: 0.1297
Epoch 238/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0260 - mae: 0.1191 - val_loss: 0.0286 - val_mae: 0.1276
Epoch 239/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0259 - mae: 0.1173 - val_loss: 0.0283 - val_mae: 0.1279
Epoch 240/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0255 - mae: 0.1157 - val_loss: 0.0279 - val_mae: 0.1281
Epoch 241/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0253 - mae: 0.1162 - val_loss: 0.0280 - val_mae: 0.1294
Epoch 242/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0256 - mae: 0.1178 - val_loss: 0.0273 - val_mae: 0.1259
Epoch 243/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0245 - mae: 0.1144 - val_loss: 0.0276 - val_mae: 0.1287
Epoch 244/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0252 - mae: 0.1163 - val_loss: 0.0268 - val_mae: 0.1263
Epoch 245/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0261 - mae: 0.1201 - val_loss: 0.0295 - val_mae: 0.1333
Epoch 246/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0268 - mae: 0.1231 - val_loss: 0.0279 - val_mae: 0.1302
Epoch 247/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0240 - mae: 0.1149 - val_loss: 0.0263 - val_mae: 0.1242
Epoch 248/500
600/600 [==============================] - 0s 66us/sample - loss: 0.0242 - mae: 0.1146 - val_loss: 0.0259 - val_mae: 0.1249
Epoch 249/500
600/600 [==============================] - 0s 69us/sample - loss: 0.0233 - mae: 0.1129 - val_loss: 0.0277 - val_mae: 0.1258
Epoch 250/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0246 - mae: 0.1158 - val_loss: 0.0255 - val_mae: 0.1237
Epoch 251/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0231 - mae: 0.1114 - val_loss: 0.0249 - val_mae: 0.1216
Epoch 252/500
600/600 [==============================] - 0s 63us/sample - loss: 0.0230 - mae: 0.1122 - val_loss: 0.0246 - val_mae: 0.1216
Epoch 253/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0229 - mae: 0.1109 - val_loss: 0.0247 - val_mae: 0.1228
Epoch 254/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0230 - mae: 0.1122 - val_loss: 0.0242 - val_mae: 0.1204
Epoch 255/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0233 - mae: 0.1139 - val_loss: 0.0252 - val_mae: 0.1209
Epoch 256/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0225 - mae: 0.1102 - val_loss: 0.0239 - val_mae: 0.1197
Epoch 257/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0219 - mae: 0.1086 - val_loss: 0.0235 - val_mae: 0.1197
Epoch 258/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0217 - mae: 0.1091 - val_loss: 0.0234 - val_mae: 0.1188
Epoch 259/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0215 - mae: 0.1082 - val_loss: 0.0231 - val_mae: 0.1184
Epoch 260/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0214 - mae: 0.1080 - val_loss: 0.0228 - val_mae: 0.1183
Epoch 261/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0214 - mae: 0.1081 - val_loss: 0.0226 - val_mae: 0.1175
Epoch 262/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0211 - mae: 0.1077 - val_loss: 0.0224 - val_mae: 0.1177
Epoch 263/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0210 - mae: 0.1075 - val_loss: 0.0223 - val_mae: 0.1176
Epoch 264/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0209 - mae: 0.1079 - val_loss: 0.0223 - val_mae: 0.1164
Epoch 265/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0208 - mae: 0.1073 - val_loss: 0.0219 - val_mae: 0.1165
Epoch 266/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0209 - mae: 0.1084 - val_loss: 0.0221 - val_mae: 0.1149
Epoch 267/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0206 - mae: 0.1075 - val_loss: 0.0215 - val_mae: 0.1148
Epoch 268/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0203 - mae: 0.1062 - val_loss: 0.0212 - val_mae: 0.1142
Epoch 269/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0201 - mae: 0.1055 - val_loss: 0.0212 - val_mae: 0.1141
Epoch 270/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0200 - mae: 0.1063 - val_loss: 0.0213 - val_mae: 0.1137
Epoch 271/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0201 - mae: 0.1066 - val_loss: 0.0211 - val_mae: 0.1127
Epoch 272/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0205 - mae: 0.1074 - val_loss: 0.0203 - val_mae: 0.1131
Epoch 273/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0197 - mae: 0.1052 - val_loss: 0.0202 - val_mae: 0.1123
Epoch 274/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0194 - mae: 0.1043 - val_loss: 0.0201 - val_mae: 0.1119
Epoch 275/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0192 - mae: 0.1038 - val_loss: 0.0199 - val_mae: 0.1118
Epoch 276/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0191 - mae: 0.1040 - val_loss: 0.0200 - val_mae: 0.1113
Epoch 277/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0191 - mae: 0.1043 - val_loss: 0.0199 - val_mae: 0.1117
Epoch 278/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0194 - mae: 0.1051 - val_loss: 0.0195 - val_mae: 0.1111
Epoch 279/500
600/600 [==============================] - 0s 65us/sample - loss: 0.0186 - mae: 0.1031 - val_loss: 0.0197 - val_mae: 0.1098
Epoch 280/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0187 - mae: 0.1031 - val_loss: 0.0192 - val_mae: 0.1103
Epoch 281/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0186 - mae: 0.1031 - val_loss: 0.0192 - val_mae: 0.1098
Epoch 282/500
600/600 [==============================] - 0s 63us/sample - loss: 0.0185 - mae: 0.1031 - val_loss: 0.0190 - val_mae: 0.1092
Epoch 283/500
600/600 [==============================] - 0s 60us/sample - loss: 0.0183 - mae: 0.1022 - val_loss: 0.0188 - val_mae: 0.1097
Epoch 284/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0181 - mae: 0.1020 - val_loss: 0.0186 - val_mae: 0.1086
Epoch 285/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0183 - mae: 0.1025 - val_loss: 0.0192 - val_mae: 0.1085
Epoch 286/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0190 - mae: 0.1057 - val_loss: 0.0190 - val_mae: 0.1106
Epoch 287/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0181 - mae: 0.1022 - val_loss: 0.0181 - val_mae: 0.1077
Epoch 288/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0177 - mae: 0.1012 - val_loss: 0.0181 - val_mae: 0.1072
Epoch 289/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0175 - mae: 0.1003 - val_loss: 0.0182 - val_mae: 0.1082
Epoch 290/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0180 - mae: 0.1028 - val_loss: 0.0179 - val_mae: 0.1064
Epoch 291/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0175 - mae: 0.1013 - val_loss: 0.0179 - val_mae: 0.1063
Epoch 292/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0175 - mae: 0.1014 - val_loss: 0.0177 - val_mae: 0.1067
Epoch 293/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0176 - mae: 0.1018 - val_loss: 0.0171 - val_mae: 0.1051
Epoch 294/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0175 - mae: 0.1010 - val_loss: 0.0175 - val_mae: 0.1050
Epoch 295/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0176 - mae: 0.1015 - val_loss: 0.0174 - val_mae: 0.1056
Epoch 296/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0173 - mae: 0.1017 - val_loss: 0.0172 - val_mae: 0.1040
Epoch 297/500
600/600 [==============================] - 0s 63us/sample - loss: 0.0168 - mae: 0.0999 - val_loss: 0.0169 - val_mae: 0.1046
Epoch 298/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0169 - mae: 0.1001 - val_loss: 0.0166 - val_mae: 0.1035
Epoch 299/500
600/600 [==============================] - 0s 141us/sample - loss: 0.0168 - mae: 0.0994 - val_loss: 0.0168 - val_mae: 0.1035
Epoch 300/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0166 - mae: 0.0999 - val_loss: 0.0162 - val_mae: 0.1026
Epoch 301/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0164 - mae: 0.0985 - val_loss: 0.0164 - val_mae: 0.1026
Epoch 302/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0162 - mae: 0.0988 - val_loss: 0.0165 - val_mae: 0.1026
Epoch 303/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0164 - mae: 0.0989 - val_loss: 0.0161 - val_mae: 0.1022
Epoch 304/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0163 - mae: 0.0988 - val_loss: 0.0161 - val_mae: 0.1026
Epoch 305/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0163 - mae: 0.0993 - val_loss: 0.0158 - val_mae: 0.1015
Epoch 306/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0162 - mae: 0.0989 - val_loss: 0.0161 - val_mae: 0.1020
Epoch 307/500
600/600 [==============================] - 0s 76us/sample - loss: 0.0166 - mae: 0.1004 - val_loss: 0.0158 - val_mae: 0.1011
Epoch 308/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0160 - mae: 0.0984 - val_loss: 0.0158 - val_mae: 0.1004
Epoch 309/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0160 - mae: 0.0983 - val_loss: 0.0160 - val_mae: 0.1012
Epoch 310/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0170 - mae: 0.1013 - val_loss: 0.0159 - val_mae: 0.1016
Epoch 311/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0160 - mae: 0.0983 - val_loss: 0.0192 - val_mae: 0.1091
Epoch 312/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0185 - mae: 0.1053 - val_loss: 0.0153 - val_mae: 0.1004
Epoch 313/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0161 - mae: 0.0997 - val_loss: 0.0162 - val_mae: 0.1010
Epoch 314/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0153 - mae: 0.0966 - val_loss: 0.0154 - val_mae: 0.1006
Epoch 315/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0162 - mae: 0.1002 - val_loss: 0.0152 - val_mae: 0.0999
Epoch 316/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0162 - mae: 0.0999 - val_loss: 0.0158 - val_mae: 0.0996
Epoch 317/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0158 - mae: 0.0985 - val_loss: 0.0170 - val_mae: 0.1026
Epoch 318/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0167 - mae: 0.1021 - val_loss: 0.0148 - val_mae: 0.0981
Epoch 319/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0161 - mae: 0.0994 - val_loss: 0.0157 - val_mae: 0.1011
Epoch 320/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0148 - mae: 0.0950 - val_loss: 0.0144 - val_mae: 0.0973
Epoch 321/500
600/600 [==============================] - 0s 62us/sample - loss: 0.0147 - mae: 0.0954 - val_loss: 0.0152 - val_mae: 0.0983
Epoch 322/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0149 - mae: 0.0955 - val_loss: 0.0147 - val_mae: 0.0982
Epoch 323/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0149 - mae: 0.0956 - val_loss: 0.0145 - val_mae: 0.0977
Epoch 324/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0147 - mae: 0.0956 - val_loss: 0.0142 - val_mae: 0.0963
Epoch 325/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0145 - mae: 0.0950 - val_loss: 0.0144 - val_mae: 0.0974
Epoch 326/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0147 - mae: 0.0957 - val_loss: 0.0141 - val_mae: 0.0965
Epoch 327/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0147 - mae: 0.0960 - val_loss: 0.0144 - val_mae: 0.0973
Epoch 328/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0145 - mae: 0.0944 - val_loss: 0.0141 - val_mae: 0.0959
Epoch 329/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0145 - mae: 0.0952 - val_loss: 0.0137 - val_mae: 0.0949
Epoch 330/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0143 - mae: 0.0944 - val_loss: 0.0139 - val_mae: 0.0952
Epoch 331/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0143 - mae: 0.0941 - val_loss: 0.0139 - val_mae: 0.0947
Epoch 332/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0141 - mae: 0.0941 - val_loss: 0.0139 - val_mae: 0.0949
Epoch 333/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0149 - mae: 0.0951 - val_loss: 0.0148 - val_mae: 0.0968
Epoch 334/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0148 - mae: 0.0957 - val_loss: 0.0151 - val_mae: 0.0979
Epoch 335/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0151 - mae: 0.0966 - val_loss: 0.0139 - val_mae: 0.0945
Epoch 336/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0141 - mae: 0.0932 - val_loss: 0.0140 - val_mae: 0.0954
Epoch 337/500
600/600 [==============================] - 0s 60us/sample - loss: 0.0141 - mae: 0.0936 - val_loss: 0.0133 - val_mae: 0.0934
Epoch 338/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0141 - mae: 0.0932 - val_loss: 0.0137 - val_mae: 0.0943
Epoch 339/500
600/600 [==============================] - 0s 62us/sample - loss: 0.0139 - mae: 0.0931 - val_loss: 0.0132 - val_mae: 0.0929
Epoch 340/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0136 - mae: 0.0923 - val_loss: 0.0132 - val_mae: 0.0929
Epoch 341/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0137 - mae: 0.0925 - val_loss: 0.0146 - val_mae: 0.0963
Epoch 342/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0140 - mae: 0.0947 - val_loss: 0.0139 - val_mae: 0.0946
Epoch 343/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0139 - mae: 0.0940 - val_loss: 0.0136 - val_mae: 0.0934
Epoch 344/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0135 - mae: 0.0920 - val_loss: 0.0132 - val_mae: 0.0925
Epoch 345/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0136 - mae: 0.0923 - val_loss: 0.0134 - val_mae: 0.0932
Epoch 346/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0134 - mae: 0.0922 - val_loss: 0.0130 - val_mae: 0.0919
Epoch 347/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0133 - mae: 0.0920 - val_loss: 0.0137 - val_mae: 0.0937
Epoch 348/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0134 - mae: 0.0926 - val_loss: 0.0133 - val_mae: 0.0926
Epoch 349/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0139 - mae: 0.0941 - val_loss: 0.0135 - val_mae: 0.0929
Epoch 350/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0133 - mae: 0.0904 - val_loss: 0.0126 - val_mae: 0.0907
Epoch 351/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0134 - mae: 0.0916 - val_loss: 0.0128 - val_mae: 0.0912
Epoch 352/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0137 - mae: 0.0928 - val_loss: 0.0131 - val_mae: 0.0916
Epoch 353/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0144 - mae: 0.0947 - val_loss: 0.0126 - val_mae: 0.0904
Epoch 354/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0131 - mae: 0.0910 - val_loss: 0.0132 - val_mae: 0.0923
Epoch 355/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0138 - mae: 0.0930 - val_loss: 0.0131 - val_mae: 0.0919
Epoch 356/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0135 - mae: 0.0926 - val_loss: 0.0126 - val_mae: 0.0904
Epoch 357/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0131 - mae: 0.0907 - val_loss: 0.0138 - val_mae: 0.0940
Epoch 358/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0132 - mae: 0.0907 - val_loss: 0.0126 - val_mae: 0.0904
Epoch 359/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0129 - mae: 0.0903 - val_loss: 0.0127 - val_mae: 0.0907
Epoch 360/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0128 - mae: 0.0900 - val_loss: 0.0126 - val_mae: 0.0902
Epoch 361/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0133 - mae: 0.0909 - val_loss: 0.0126 - val_mae: 0.0905
Epoch 362/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0130 - mae: 0.0907 - val_loss: 0.0125 - val_mae: 0.0898
Epoch 363/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0129 - mae: 0.0899 - val_loss: 0.0124 - val_mae: 0.0896
Epoch 364/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0129 - mae: 0.0903 - val_loss: 0.0126 - val_mae: 0.0900
Epoch 365/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0129 - mae: 0.0898 - val_loss: 0.0125 - val_mae: 0.0901
Epoch 366/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0129 - mae: 0.0910 - val_loss: 0.0131 - val_mae: 0.0912
Epoch 367/500
600/600 [==============================] - 0s 72us/sample - loss: 0.0127 - mae: 0.0895 - val_loss: 0.0122 - val_mae: 0.0890
Epoch 368/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0129 - mae: 0.0905 - val_loss: 0.0126 - val_mae: 0.0905
Epoch 369/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0129 - mae: 0.0902 - val_loss: 0.0123 - val_mae: 0.0889
Epoch 370/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0127 - mae: 0.0899 - val_loss: 0.0125 - val_mae: 0.0894
Epoch 371/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0134 - mae: 0.0920 - val_loss: 0.0139 - val_mae: 0.0931
Epoch 372/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0134 - mae: 0.0916 - val_loss: 0.0129 - val_mae: 0.0905
Epoch 373/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0129 - mae: 0.0907 - val_loss: 0.0126 - val_mae: 0.0897
Epoch 374/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0128 - mae: 0.0899 - val_loss: 0.0121 - val_mae: 0.0879
Epoch 375/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0133 - mae: 0.0923 - val_loss: 0.0125 - val_mae: 0.0904
Epoch 376/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0129 - mae: 0.0908 - val_loss: 0.0130 - val_mae: 0.0915
Epoch 377/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0129 - mae: 0.0911 - val_loss: 0.0119 - val_mae: 0.0877
Epoch 378/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0138 - mae: 0.0941 - val_loss: 0.0121 - val_mae: 0.0881
Epoch 379/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0127 - mae: 0.0898 - val_loss: 0.0127 - val_mae: 0.0895
Epoch 380/500
600/600 [==============================] - 0s 46us/sample - loss: 0.0129 - mae: 0.0903 - val_loss: 0.0120 - val_mae: 0.0876
Epoch 381/500
600/600 [==============================] - 0s 45us/sample - loss: 0.0126 - mae: 0.0896 - val_loss: 0.0120 - val_mae: 0.0876
Epoch 382/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0130 - mae: 0.0917 - val_loss: 0.0121 - val_mae: 0.0880
Epoch 383/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0126 - mae: 0.0895 - val_loss: 0.0120 - val_mae: 0.0882
Epoch 384/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0128 - mae: 0.0910 - val_loss: 0.0150 - val_mae: 0.0983
Epoch 385/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0134 - mae: 0.0912 - val_loss: 0.0118 - val_mae: 0.0876
Epoch 386/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0124 - mae: 0.0892 - val_loss: 0.0123 - val_mae: 0.0886
Epoch 387/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0127 - mae: 0.0898 - val_loss: 0.0128 - val_mae: 0.0900
Epoch 388/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0128 - mae: 0.0903 - val_loss: 0.0129 - val_mae: 0.0906
Epoch 389/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0148 - mae: 0.0984 - val_loss: 0.0121 - val_mae: 0.0880
Epoch 390/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0137 - mae: 0.0939 - val_loss: 0.0118 - val_mae: 0.0874
Epoch 391/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0127 - mae: 0.0896 - val_loss: 0.0122 - val_mae: 0.0893
Epoch 392/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0124 - mae: 0.0888 - val_loss: 0.0118 - val_mae: 0.0873
Epoch 393/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0124 - mae: 0.0887 - val_loss: 0.0119 - val_mae: 0.0879
Epoch 394/500
600/600 [==============================] - 0s 62us/sample - loss: 0.0124 - mae: 0.0885 - val_loss: 0.0117 - val_mae: 0.0865
Epoch 395/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0128 - mae: 0.0904 - val_loss: 0.0121 - val_mae: 0.0880
Epoch 396/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0126 - mae: 0.0895 - val_loss: 0.0119 - val_mae: 0.0874
Epoch 397/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0124 - mae: 0.0883 - val_loss: 0.0120 - val_mae: 0.0880
Epoch 398/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0130 - mae: 0.0906 - val_loss: 0.0122 - val_mae: 0.0891
Epoch 399/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0136 - mae: 0.0935 - val_loss: 0.0128 - val_mae: 0.0917
Epoch 400/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0136 - mae: 0.0923 - val_loss: 0.0128 - val_mae: 0.0910
Epoch 401/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0126 - mae: 0.0896 - val_loss: 0.0134 - val_mae: 0.0934
Epoch 402/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0135 - mae: 0.0925 - val_loss: 0.0127 - val_mae: 0.0910
Epoch 403/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0129 - mae: 0.0904 - val_loss: 0.0117 - val_mae: 0.0868
Epoch 404/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0126 - mae: 0.0898 - val_loss: 0.0140 - val_mae: 0.0928
Epoch 405/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0132 - mae: 0.0928 - val_loss: 0.0117 - val_mae: 0.0869
Epoch 406/500
600/600 [==============================] - 0s 47us/sample - loss: 0.0126 - mae: 0.0906 - val_loss: 0.0128 - val_mae: 0.0908
Epoch 407/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0122 - mae: 0.0880 - val_loss: 0.0117 - val_mae: 0.0870
Epoch 408/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0125 - mae: 0.0897 - val_loss: 0.0119 - val_mae: 0.0875
Epoch 409/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0124 - mae: 0.0889 - val_loss: 0.0118 - val_mae: 0.0869
Epoch 410/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0124 - mae: 0.0888 - val_loss: 0.0117 - val_mae: 0.0868
Epoch 411/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0122 - mae: 0.0886 - val_loss: 0.0139 - val_mae: 0.0933
Epoch 412/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0132 - mae: 0.0923 - val_loss: 0.0125 - val_mae: 0.0891
Epoch 413/500
600/600 [==============================] - 0s 62us/sample - loss: 0.0140 - mae: 0.0938 - val_loss: 0.0119 - val_mae: 0.0875
Epoch 414/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0134 - mae: 0.0917 - val_loss: 0.0125 - val_mae: 0.0897
Epoch 415/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0131 - mae: 0.0917 - val_loss: 0.0126 - val_mae: 0.0904
Epoch 416/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0128 - mae: 0.0900 - val_loss: 0.0129 - val_mae: 0.0912
Epoch 417/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0124 - mae: 0.0890 - val_loss: 0.0118 - val_mae: 0.0874
Epoch 418/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0128 - mae: 0.0899 - val_loss: 0.0132 - val_mae: 0.0925
Epoch 419/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0131 - mae: 0.0917 - val_loss: 0.0120 - val_mae: 0.0882
Epoch 420/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0124 - mae: 0.0884 - val_loss: 0.0130 - val_mae: 0.0919
Epoch 421/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0131 - mae: 0.0914 - val_loss: 0.0130 - val_mae: 0.0916
Epoch 422/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0133 - mae: 0.0921 - val_loss: 0.0115 - val_mae: 0.0864
Epoch 423/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0123 - mae: 0.0886 - val_loss: 0.0120 - val_mae: 0.0876
Epoch 424/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0122 - mae: 0.0883 - val_loss: 0.0141 - val_mae: 0.0935
Epoch 425/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0136 - mae: 0.0936 - val_loss: 0.0117 - val_mae: 0.0869
Epoch 426/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0134 - mae: 0.0922 - val_loss: 0.0116 - val_mae: 0.0868
Epoch 427/500
600/600 [==============================] - 0s 66us/sample - loss: 0.0121 - mae: 0.0879 - val_loss: 0.0116 - val_mae: 0.0867
Epoch 428/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0121 - mae: 0.0882 - val_loss: 0.0121 - val_mae: 0.0881
Epoch 429/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0125 - mae: 0.0895 - val_loss: 0.0114 - val_mae: 0.0859
Epoch 430/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0123 - mae: 0.0883 - val_loss: 0.0129 - val_mae: 0.0901
Epoch 431/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0126 - mae: 0.0900 - val_loss: 0.0120 - val_mae: 0.0877
Epoch 432/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0123 - mae: 0.0882 - val_loss: 0.0118 - val_mae: 0.0870
Epoch 433/500
600/600 [==============================] - 0s 60us/sample - loss: 0.0120 - mae: 0.0879 - val_loss: 0.0120 - val_mae: 0.0878
Epoch 434/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0122 - mae: 0.0877 - val_loss: 0.0114 - val_mae: 0.0861
Epoch 435/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0120 - mae: 0.0877 - val_loss: 0.0120 - val_mae: 0.0876
Epoch 436/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0122 - mae: 0.0885 - val_loss: 0.0115 - val_mae: 0.0862
Epoch 437/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0120 - mae: 0.0882 - val_loss: 0.0117 - val_mae: 0.0867
Epoch 438/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0119 - mae: 0.0872 - val_loss: 0.0116 - val_mae: 0.0865
Epoch 439/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0122 - mae: 0.0885 - val_loss: 0.0116 - val_mae: 0.0864
Epoch 440/500
600/600 [==============================] - 0s 65us/sample - loss: 0.0122 - mae: 0.0888 - val_loss: 0.0123 - val_mae: 0.0889
Epoch 441/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0120 - mae: 0.0886 - val_loss: 0.0116 - val_mae: 0.0864
Epoch 442/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0124 - mae: 0.0880 - val_loss: 0.0120 - val_mae: 0.0880
Epoch 443/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0121 - mae: 0.0875 - val_loss: 0.0123 - val_mae: 0.0885
Epoch 444/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0124 - mae: 0.0895 - val_loss: 0.0118 - val_mae: 0.0875
Epoch 445/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0126 - mae: 0.0902 - val_loss: 0.0117 - val_mae: 0.0869
Epoch 446/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0121 - mae: 0.0873 - val_loss: 0.0132 - val_mae: 0.0925
Epoch 447/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0124 - mae: 0.0883 - val_loss: 0.0124 - val_mae: 0.0890
Epoch 448/500
600/600 [==============================] - 0s 69us/sample - loss: 0.0120 - mae: 0.0877 - val_loss: 0.0115 - val_mae: 0.0863
Epoch 449/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0122 - mae: 0.0885 - val_loss: 0.0115 - val_mae: 0.0865
Epoch 450/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0125 - mae: 0.0904 - val_loss: 0.0118 - val_mae: 0.0872
Epoch 451/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0119 - mae: 0.0869 - val_loss: 0.0126 - val_mae: 0.0895
Epoch 452/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0124 - mae: 0.0890 - val_loss: 0.0116 - val_mae: 0.0867
Epoch 453/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0119 - mae: 0.0872 - val_loss: 0.0117 - val_mae: 0.0868
Epoch 454/500
600/600 [==============================] - 0s 49us/sample - loss: 0.0120 - mae: 0.0878 - val_loss: 0.0116 - val_mae: 0.0863
Epoch 455/500
600/600 [==============================] - 0s 61us/sample - loss: 0.0120 - mae: 0.0878 - val_loss: 0.0117 - val_mae: 0.0870
Epoch 456/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0118 - mae: 0.0869 - val_loss: 0.0115 - val_mae: 0.0862
Epoch 457/500
600/600 [==============================] - 0s 66us/sample - loss: 0.0121 - mae: 0.0883 - val_loss: 0.0116 - val_mae: 0.0866
Epoch 458/500
600/600 [==============================] - 0s 60us/sample - loss: 0.0121 - mae: 0.0876 - val_loss: 0.0116 - val_mae: 0.0863
Epoch 459/500
600/600 [==============================] - 0s 60us/sample - loss: 0.0119 - mae: 0.0872 - val_loss: 0.0116 - val_mae: 0.0864
Epoch 460/500
600/600 [==============================] - 0s 48us/sample - loss: 0.0119 - mae: 0.0871 - val_loss: 0.0115 - val_mae: 0.0862
Epoch 461/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0120 - mae: 0.0880 - val_loss: 0.0120 - val_mae: 0.0881
Epoch 462/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0119 - mae: 0.0872 - val_loss: 0.0116 - val_mae: 0.0864
Epoch 463/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0119 - mae: 0.0873 - val_loss: 0.0117 - val_mae: 0.0866
Epoch 464/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0118 - mae: 0.0868 - val_loss: 0.0115 - val_mae: 0.0862
Epoch 465/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0120 - mae: 0.0875 - val_loss: 0.0124 - val_mae: 0.0896
Epoch 466/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0117 - mae: 0.0875 - val_loss: 0.0129 - val_mae: 0.0901
Epoch 467/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0126 - mae: 0.0907 - val_loss: 0.0127 - val_mae: 0.0898
Epoch 468/500
600/600 [==============================] - 0s 58us/sample - loss: 0.0125 - mae: 0.0893 - val_loss: 0.0118 - val_mae: 0.0874
Epoch 469/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0122 - mae: 0.0887 - val_loss: 0.0115 - val_mae: 0.0864
Epoch 470/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0119 - mae: 0.0874 - val_loss: 0.0119 - val_mae: 0.0876
Epoch 471/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0118 - mae: 0.0866 - val_loss: 0.0116 - val_mae: 0.0867
Epoch 472/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0120 - mae: 0.0873 - val_loss: 0.0118 - val_mae: 0.0872
Epoch 473/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0121 - mae: 0.0882 - val_loss: 0.0115 - val_mae: 0.0863
Epoch 474/500
600/600 [==============================] - 0s 55us/sample - loss: 0.0118 - mae: 0.0871 - val_loss: 0.0117 - val_mae: 0.0867
Epoch 475/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0120 - mae: 0.0877 - val_loss: 0.0121 - val_mae: 0.0884
Epoch 476/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0127 - mae: 0.0902 - val_loss: 0.0119 - val_mae: 0.0877
Epoch 477/500
600/600 [==============================] - 0s 61us/sample - loss: 0.0122 - mae: 0.0882 - val_loss: 0.0151 - val_mae: 0.0967
Epoch 478/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0136 - mae: 0.0933 - val_loss: 0.0123 - val_mae: 0.0889
Epoch 479/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0121 - mae: 0.0884 - val_loss: 0.0116 - val_mae: 0.0869
Epoch 480/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0121 - mae: 0.0883 - val_loss: 0.0118 - val_mae: 0.0877
Epoch 481/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0120 - mae: 0.0876 - val_loss: 0.0118 - val_mae: 0.0875
Epoch 482/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0121 - mae: 0.0887 - val_loss: 0.0116 - val_mae: 0.0865
Epoch 483/500
600/600 [==============================] - 0s 70us/sample - loss: 0.0122 - mae: 0.0892 - val_loss: 0.0114 - val_mae: 0.0863
Epoch 484/500
600/600 [==============================] - 0s 57us/sample - loss: 0.0132 - mae: 0.0926 - val_loss: 0.0115 - val_mae: 0.0866
Epoch 485/500
600/600 [==============================] - 0s 70us/sample - loss: 0.0138 - mae: 0.0948 - val_loss: 0.0118 - val_mae: 0.0874
Epoch 486/500
600/600 [==============================] - 0s 59us/sample - loss: 0.0119 - mae: 0.0879 - val_loss: 0.0114 - val_mae: 0.0860
Epoch 487/500
600/600 [==============================] - 0s 50us/sample - loss: 0.0118 - mae: 0.0872 - val_loss: 0.0116 - val_mae: 0.0870
Epoch 488/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0117 - mae: 0.0870 - val_loss: 0.0114 - val_mae: 0.0861
Epoch 489/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0118 - mae: 0.0869 - val_loss: 0.0120 - val_mae: 0.0879
Epoch 490/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0119 - mae: 0.0873 - val_loss: 0.0115 - val_mae: 0.0863
Epoch 491/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0118 - mae: 0.0871 - val_loss: 0.0117 - val_mae: 0.0873
Epoch 492/500
600/600 [==============================] - 0s 61us/sample - loss: 0.0122 - mae: 0.0886 - val_loss: 0.0127 - val_mae: 0.0899
Epoch 493/500
600/600 [==============================] - 0s 54us/sample - loss: 0.0122 - mae: 0.0881 - val_loss: 0.0113 - val_mae: 0.0857
Epoch 494/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0125 - mae: 0.0898 - val_loss: 0.0119 - val_mae: 0.0880
Epoch 495/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0123 - mae: 0.0897 - val_loss: 0.0116 - val_mae: 0.0866
Epoch 496/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0119 - mae: 0.0875 - val_loss: 0.0115 - val_mae: 0.0866
Epoch 497/500
600/600 [==============================] - 0s 56us/sample - loss: 0.0118 - mae: 0.0868 - val_loss: 0.0117 - val_mae: 0.0871
Epoch 498/500
600/600 [==============================] - 0s 52us/sample - loss: 0.0124 - mae: 0.0889 - val_loss: 0.0116 - val_mae: 0.0866
Epoch 499/500
600/600 [==============================] - 0s 51us/sample - loss: 0.0119 - mae: 0.0871 - val_loss: 0.0115 - val_mae: 0.0863
Epoch 500/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0118 - mae: 0.0873 - val_loss: 0.0115 - val_mae: 0.0864

3. Plot Metrics

During each training epoch, the model prints its loss and mean absolute error for training and validation. You can read these in the output above (note that your exact numbers may differ):

Epoch 500/500
600/600 [==============================] - 0s 53us/sample - loss: 0.0118 - mae: 0.0873 - val_loss: 0.0115 - val_mae: 0.0864

You can see that we've already got a huge improvement - validation loss has dropped from 0.15 to 0.01, and validation MAE has dropped from 0.33 to 0.09.

The following cell will print the same graphs we used to evaluate our original model, but showing our new training history:


In [42]:
# Draw a graph of the loss, which is the distance between
# the predicted and actual values during training and validation.
loss = history_2.history['loss']
val_loss = history_2.history['val_loss']

epochs = range(1, len(loss) + 1)

# Exclude the first few epochs so the graph is easier to read
SKIP = 100

plt.figure(figsize=(10, 4))
plt.subplot(1, 2, 1)

plt.plot(epochs[SKIP:], loss[SKIP:], 'g.', label='Training loss')
plt.plot(epochs[SKIP:], val_loss[SKIP:], 'b.', label='Validation loss')
plt.title('Training and validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()

plt.subplot(1, 2, 2)

# Draw a graph of mean absolute error, which is another way of
# measuring the amount of error in the prediction.
mae = history_2.history['mae']
val_mae = history_2.history['val_mae']

plt.plot(epochs[SKIP:], mae[SKIP:], 'g.', label='Training MAE')
plt.plot(epochs[SKIP:], val_mae[SKIP:], 'b.', label='Validation MAE')
plt.title('Training and validation mean absolute error')
plt.xlabel('Epochs')
plt.ylabel('MAE')
plt.legend()

plt.tight_layout()


Great results! From these graphs, we can see several exciting things:

  • The overall loss and MAE are much better than our previous network
  • Metrics are better for validation than training, which means the network is not overfitting

The reason the metrics for validation are better than those for training is that validation metrics are calculated at the end of each epoch, while training metrics are calculated throughout the epoch, so validation happens on a model that has been trained slightly longer.

This all means our network seems to be performing well! To confirm, let's check its predictions against the test dataset we set aside earlier:


In [17]:
# Calculate and print the loss on our test dataset
loss = model_2.evaluate(x_test, y_test)

# Make predictions based on our test dataset
predictions = model_2.predict(x_test)

# Graph the predictions against the actual values
plt.clf()
plt.title('Comparison of predictions and actual values')
plt.plot(x_test, y_test, 'b.', label='Actual')
plt.plot(x_test, predictions, 'r.', label='Predicted')
plt.legend()
plt.show()


200/1 [==============================] - 0s 40us/sample - loss: 0.0082 - mae: 0.0827

Much better! The evaluation metrics we printed show that the model has a low loss and MAE on the test data, and the predictions line up visually with our data fairly well.

The model isn't perfect; its predictions don't form a smooth sine curve. For instance, the line is almost straight when x is between 4.2 and 5.2. If we wanted to go further, we could try increasing the capacity of the model, perhaps combined with techniques to guard against overfitting.
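If we did want to experiment, a higher-capacity variant might look like the sketch below (hypothetical - model_3 is purely illustrative and isn't used elsewhere in this notebook):

# Hypothetical higher-capacity variant: wider layers can fit the curve's
# shape more closely, at the cost of a larger converted model.
model_3 = tf.keras.Sequential()
model_3.add(keras.layers.Dense(32, activation='relu', input_shape=(1,)))
model_3.add(keras.layers.Dense(32, activation='relu'))
model_3.add(keras.layers.Dense(1))
model_3.compile(optimizer='adam', loss='mse', metrics=['mae'])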

However, an important part of machine learning is knowing when to quit, and this model is good enough for our use case - which is to make some LEDs blink in a pleasing pattern.

Generate a TensorFlow Lite Model

1. Generate Models with or without Quantization

We now have an acceptably accurate model. We'll use the TensorFlow Lite Converter to convert the model into a special, space-efficient format for use on memory-constrained devices.

Since this model is going to be deployed on a microcontroller, we want it to be as tiny as possible! One technique for reducing model size is quantization, which can be applied while converting the model. It reduces the precision of the model's weights, and possibly of the activations (the outputs of each layer) as well, which saves memory, often with little impact on accuracy. Quantized models also run faster, since the required calculations are simpler.
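To build intuition for what quantization does, here is a minimal sketch of affine int8 quantization (illustrative only - the TFLite Converter chooses the actual scales and zero points for us, and the function names below are ours):

# Illustrative sketch only: map floats into the int8 range [-128, 127]
# using a shared scale and zero point, then recover approximate floats.
def quantize_int8(values):
    scale = (values.max() - values.min()) / 255.0
    zero_point = int(round(-128 - values.min() / scale))
    q = np.clip(np.round(values / scale) + zero_point, -128, 127)
    return q.astype(np.int8), scale, zero_point

def dequantize_int8(q, scale, zero_point):
    # x is approximately scale * (q - zero_point)
    return scale * (q.astype(np.float32) - zero_point)

weights = np.random.randn(16).astype(np.float32)
q, scale, zp = quantize_int8(weights)
print(np.max(np.abs(weights - dequantize_int8(q, scale, zp))))  # small error

Each 8-bit value stands in for a float via the shared scale and zero point, which is why the converter needs representative inputs to pick good ranges (see the representative_dataset function below).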

Note: Currently, the TFLite Converter produces TFLite models with float interfaces (input and output ops are always float). This is a blocker for users who require TFLite models with pure int8 or uint8 inputs/outputs. See https://github.com/tensorflow/tensorflow/issues/38285

In the following cell, we'll convert the model twice: once with quantization, once without.


In [18]:
# Convert the model to the TensorFlow Lite format without quantization
converter = tf.lite.TFLiteConverter.from_keras_model(model_2)
model_no_quant_tflite = converter.convert()

# Save the model to disk
open(MODEL_NO_QUANT_TFLITE, "wb").write(model_no_quant_tflite)

# Convert the model to the TensorFlow Lite format with quantization
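# The representative dataset below feeds sample inputs through the model
# so the converter can calibrate the value ranges (scales and zero points)
# used during quantization.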
def representative_dataset():
  for i in range(500):
    yield([x_train[i].reshape(1, 1)])
# Set the optimization flag.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
# Enforce full-int8 quantization (except inputs/outputs which are always float)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
# Provide a representative dataset to ensure we quantize correctly.
converter.representative_dataset = representative_dataset
model_tflite = converter.convert()

# Save the model to disk
open(MODEL_TFLITE, "wb").write(model_tflite)


Out[18]:
2512

2. Compare Model Sizes


In [19]:
import os
model_no_quant_size = os.path.getsize(MODEL_NO_QUANT_TFLITE)
print("Model is %d bytes" % model_no_quant_size)
model_size = os.path.getsize(MODEL_TFLITE)
print("Quantized model is %d bytes" % model_size)
difference = model_no_quant_size - model_size
print("Difference is %d bytes" % difference)


Model is 2736 bytes
Quantized model is 2512 bytes
Difference is 224 bytes

Our quantized model is just 224 bytes smaller than the original version - a tiny reduction in size! At around 2.5 kilobytes, this model is already so small that its weights make up only a small fraction of the overall size, so quantization has little effect.

More complex models have many more weights, meaning the space saving from quantization will be much higher, approaching 4x for more sophisticated models.
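For example, a model with 100,000 weights stored as float32 (4 bytes each) occupies about 400,000 bytes; the same weights stored as int8 (1 byte each) take about 100,000 bytes, so the weight storage alone shrinks by nearly 4x, with only a small overhead for the quantization parameters.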

Regardless, our quantized model will take less time to execute than the original version, which is important on a tiny microcontroller!

3. Test the Models

To prove these models are still accurate after conversion and quantization, we'll use both of them to make predictions and compare the results against our test data:


In [20]:
# Instantiate an interpreter for each model
model_no_quant = tf.lite.Interpreter(MODEL_NO_QUANT_TFLITE)
model = tf.lite.Interpreter(MODEL_TFLITE)

# Allocate memory for each model
model_no_quant.allocate_tensors()
model.allocate_tensors()

# Get the input and output tensors so we can feed in values and get the results
model_no_quant_input = model_no_quant.tensor(model_no_quant.get_input_details()[0]["index"])
model_no_quant_output = model_no_quant.tensor(model_no_quant.get_output_details()[0]["index"])
model_input = model.tensor(model.get_input_details()[0]["index"])
model_output = model.tensor(model.get_output_details()[0]["index"])

# Create arrays to store the results
model_no_quant_predictions = np.empty(x_test.size)
model_predictions = np.empty(x_test.size)

# Run each model's interpreter for each value and store the results in arrays
for i in range(x_test.size):
  model_no_quant_input().fill(x_test[i])
  model_no_quant.invoke()
  model_no_quant_predictions[i] = model_no_quant_output()[0]

  model_input().fill(x_test[i])
  model.invoke()
  model_predictions[i] = model_output()[0]

# See how they line up with the data
plt.clf()
plt.title('Comparison of various models against actual values')
plt.plot(x_test, y_test, 'bo', label='Actual values')
plt.plot(x_test, predictions, 'ro', label='Original predictions')
plt.plot(x_test, model_no_quant_predictions, 'bx', label='Lite predictions')
plt.plot(x_test, model_predictions, 'gx', label='Lite quantized predictions')
plt.legend()
plt.show()


We can see from the graph that the predictions for the original model, the converted model, and the quantized model are all close enough to be indistinguishable. This means that our quantized model is ready to use!
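Eyeballing the graph works for a problem this simple, but a numeric check is more convincing. A quick sketch reusing the arrays from the cell above (predictions holds the original Keras model's test-set output from the earlier evaluation):

# Mean squared error of each model against the noisy test labels
for name, preds in [("Original", predictions.flatten()),
                    ("Lite", model_no_quant_predictions),
                    ("Lite quantized", model_predictions)]:
  print("%s MSE: %f" % (name, np.mean((y_test - preds) ** 2)))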

Generate a TensorFlow Lite for Microcontrollers Model

Convert the TensorFlow Lite quantized model into a C source file that can be loaded by TensorFlow Lite for Microcontrollers.


In [21]:
# Install xxd if it is not available
!apt-get update && apt-get -qq install xxd
# Convert to a C source file
!xxd -i {MODEL_TFLITE} > {MODEL_TFLITE_MICRO}
# Update variable names
REPLACE_TEXT = MODEL_TFLITE.replace('/', '_').replace('.', '_')
!sed -i 's/'{REPLACE_TEXT}'/g_model/g' {MODEL_TFLITE_MICRO}


Get:1 https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/ InRelease [3,626 B]
Ign:2 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64  InRelease
Hit:3 http://archive.ubuntu.com/ubuntu bionic InRelease
Get:4 http://security.ubuntu.com/ubuntu bionic-security InRelease [88.7 kB]
Hit:5 http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu bionic InRelease
Ign:6 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64  InRelease
Hit:7 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64  Release
Hit:8 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64  Release
Get:9 http://archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]
Get:10 http://ppa.launchpad.net/marutter/c2d4u3.5/ubuntu bionic InRelease [15.4 kB]
Get:11 http://archive.ubuntu.com/ubuntu bionic-backports InRelease [74.6 kB]
Get:14 http://ppa.launchpad.net/marutter/c2d4u3.5/ubuntu bionic/main Sources [1,810 kB]
Get:15 http://security.ubuntu.com/ubuntu bionic-security/restricted amd64 Packages [38.5 kB]
Get:16 http://security.ubuntu.com/ubuntu bionic-security/main amd64 Packages [873 kB]
Get:17 http://archive.ubuntu.com/ubuntu bionic-updates/universe amd64 Packages [1,368 kB]
Get:18 http://security.ubuntu.com/ubuntu bionic-security/universe amd64 Packages [835 kB]
Get:19 http://archive.ubuntu.com/ubuntu bionic-updates/restricted amd64 Packages [57.5 kB]
Get:20 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 Packages [1,176 kB]
Get:21 http://ppa.launchpad.net/marutter/c2d4u3.5/ubuntu bionic/main amd64 Packages [873 kB]
Fetched 7,301 kB in 3s (2,475 kB/s)
Reading package lists... Done
Selecting previously unselected package xxd.
(Reading database ... 144568 files and directories currently installed.)
Preparing to unpack .../xxd_2%3a8.0.1453-1ubuntu1.3_amd64.deb ...
Unpacking xxd (2:8.0.1453-1ubuntu1.3) ...
Setting up xxd (2:8.0.1453-1ubuntu1.3) ...
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
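
If xxd is not available on your platform, the same C array can be produced with a few lines of Python. A minimal sketch that writes output equivalent to the xxd/sed pair above:

with open(MODEL_TFLITE, 'rb') as f:
  data = f.read()
with open(MODEL_TFLITE_MICRO, 'w') as f:
  f.write('unsigned char g_model[] = {\n')
  for i in range(0, len(data), 12):
    f.write('  %s,\n' % ', '.join('0x%02x' % b for b in data[i:i+12]))
  f.write('};\nunsigned int g_model_len = %d;\n' % len(data))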

Deploy to a Microcontroller

Follow the instructions in the hello_world README.md for TensorFlow Lite for Microcontrollers to deploy this model on a specific microcontroller.

Reference Model: If you have not modified this notebook, you can follow the instructions as is, to deploy the model. Refer to the hello_world/train/models directory to access the models generated in this notebook.

New Model: If you have generated a new model, then update the values assigned to the variables defined in hello_world/model.cc with values displayed after running the following cell.


In [22]:
# Print the C source file
!cat {MODEL_TFLITE_MICRO}


unsigned char g_model[] = {
  0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33, 0x00, 0x00, 0x12, 0x00,
  0x1c, 0x00, 0x04, 0x00, 0x08, 0x00, 0x0c, 0x00, 0x10, 0x00, 0x14, 0x00,
  0x00, 0x00, 0x18, 0x00, 0x12, 0x00, 0x00, 0x00, 0x03, 0x00, 0x00, 0x00,
  0x60, 0x09, 0x00, 0x00, 0xa8, 0x02, 0x00, 0x00, 0x90, 0x02, 0x00, 0x00,
  0x3c, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x0c, 0x00, 0x00, 0x00, 0x08, 0x00, 0x0c, 0x00, 0x04, 0x00, 0x08, 0x00,
  0x08, 0x00, 0x00, 0x00, 0x08, 0x00, 0x00, 0x00, 0x0b, 0x00, 0x00, 0x00,
  0x13, 0x00, 0x00, 0x00, 0x6d, 0x69, 0x6e, 0x5f, 0x72, 0x75, 0x6e, 0x74,
  0x69, 0x6d, 0x65, 0x5f, 0x76, 0x65, 0x72, 0x73, 0x69, 0x6f, 0x6e, 0x00,
  0x0c, 0x00, 0x00, 0x00, 0x48, 0x02, 0x00, 0x00, 0x34, 0x02, 0x00, 0x00,
  0x0c, 0x02, 0x00, 0x00, 0xfc, 0x00, 0x00, 0x00, 0xac, 0x00, 0x00, 0x00,
  0x8c, 0x00, 0x00, 0x00, 0x3c, 0x00, 0x00, 0x00, 0x34, 0x00, 0x00, 0x00,
  0x2c, 0x00, 0x00, 0x00, 0x24, 0x00, 0x00, 0x00, 0x1c, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0xfe, 0xfd, 0xff, 0xff, 0x04, 0x00, 0x00, 0x00,
  0x05, 0x00, 0x00, 0x00, 0x31, 0x2e, 0x35, 0x2e, 0x30, 0x00, 0x00, 0x00,
  0x7c, 0xfd, 0xff, 0xff, 0x80, 0xfd, 0xff, 0xff, 0x84, 0xfd, 0xff, 0xff,
  0x88, 0xfd, 0xff, 0xff, 0x22, 0xfe, 0xff, 0xff, 0x04, 0x00, 0x00, 0x00,
  0x40, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xfc, 0x04, 0x00, 0x00,
  0x9f, 0x0a, 0x00, 0x00, 0x65, 0x06, 0x00, 0x00, 0x3d, 0xf8, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x00, 0xeb, 0x0a, 0x00, 0x00, 0x2f, 0xf8, 0xff, 0xff,
  0xe8, 0x04, 0x00, 0x00, 0x21, 0x0a, 0x00, 0x00, 0x46, 0xfe, 0xff, 0xff,
  0xc8, 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xa3, 0xf7, 0xff, 0xff,
  0x28, 0xf9, 0xff, 0xff, 0x9a, 0x05, 0x00, 0x00, 0x6e, 0xfe, 0xff, 0xff,
  0x04, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0x73, 0x1c, 0x11, 0xe1,
  0x0c, 0x81, 0xa5, 0x43, 0xfe, 0xd5, 0xd5, 0xb2, 0x60, 0x77, 0x19, 0xdf,
  0x8a, 0xfe, 0xff, 0xff, 0x04, 0x00, 0x00, 0x00, 0x40, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x51, 0x0b, 0x00, 0x00, 0x47, 0xf6, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x00, 0x1c, 0x0c, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x9b, 0x05, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0xe7, 0x20, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x92, 0x07, 0x00, 0x00, 0xf4, 0xf4, 0xff, 0xff, 0x55, 0xf0, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x00, 0xd6, 0xfe, 0xff, 0xff, 0x04, 0x00, 0x00, 0x00,
  0x00, 0x01, 0x00, 0x00, 0xee, 0xfc, 0x00, 0xec, 0x05, 0x16, 0xef, 0xec,
  0xe6, 0xf8, 0x03, 0x01, 0x00, 0xfa, 0xf8, 0xf5, 0xda, 0xeb, 0x27, 0x14,
  0xef, 0xde, 0xe2, 0xda, 0xf0, 0xdf, 0x32, 0x06, 0x01, 0xe6, 0xee, 0xf9,
  0x00, 0x16, 0x07, 0xe0, 0xfe, 0xff, 0xe9, 0x05, 0xe7, 0xef, 0x81, 0x1b,
  0x18, 0xea, 0xca, 0x01, 0x0f, 0x00, 0xdb, 0xf7, 0x0e, 0xec, 0x12, 0x1e,
  0x04, 0x13, 0xb2, 0xe7, 0xfd, 0x06, 0xbb, 0xe0, 0x0c, 0xec, 0xf0, 0xdf,
  0xeb, 0xf7, 0x05, 0x26, 0x19, 0xe4, 0x70, 0x1a, 0xea, 0x1e, 0x34, 0xdf,
  0x19, 0xf3, 0xf1, 0x19, 0x0e, 0x03, 0x1b, 0xe1, 0xde, 0x13, 0xf6, 0x19,
  0xff, 0xf6, 0x1a, 0x17, 0xf1, 0x1c, 0xdb, 0x1a, 0x1a, 0x20, 0xe6, 0x19,
  0xf5, 0xff, 0x97, 0x0b, 0x00, 0x00, 0xce, 0xdf, 0x0d, 0xf7, 0x15, 0xe4,
  0xed, 0xfc, 0x0d, 0xe9, 0xfb, 0xec, 0x5c, 0xfc, 0x1d, 0x02, 0x58, 0xe3,
  0xe0, 0xf4, 0x15, 0xec, 0xf9, 0x00, 0x13, 0x05, 0xec, 0x0c, 0x1c, 0x14,
  0x0c, 0xe9, 0x0a, 0xf4, 0x18, 0x00, 0xd7, 0x05, 0x27, 0x02, 0x15, 0xea,
  0xea, 0x02, 0x9b, 0x00, 0x0c, 0xfa, 0xe9, 0xea, 0xfe, 0x01, 0x14, 0xfd,
  0x0b, 0x02, 0xf0, 0xef, 0x06, 0xee, 0x01, 0x0d, 0x06, 0xe7, 0xf7, 0x11,
  0xf5, 0x0a, 0xf9, 0xf1, 0x23, 0xff, 0x0d, 0xf2, 0xec, 0x11, 0x26, 0x1d,
  0xf2, 0xea, 0x28, 0x18, 0xe0, 0xfb, 0xf3, 0xf4, 0x05, 0x1c, 0x1d, 0xfb,
  0xfd, 0x1e, 0xfc, 0x11, 0xe8, 0x06, 0x09, 0x03, 0x12, 0xf2, 0x35, 0xfb,
  0xdd, 0x1b, 0xf9, 0xef, 0xf3, 0xe7, 0x6f, 0x0c, 0x1d, 0x00, 0x43, 0xfd,
  0x0d, 0xf1, 0x0a, 0x19, 0x1a, 0xfa, 0xe0, 0x18, 0x1e, 0x13, 0x37, 0x1c,
  0x12, 0xec, 0x3a, 0x0c, 0xb6, 0xcb, 0xe6, 0x13, 0xf7, 0xeb, 0xf1, 0x05,
  0x1b, 0xfa, 0x19, 0xe5, 0xec, 0xcf, 0x0c, 0xf4, 0xe2, 0xff, 0xff, 0xff,
  0x04, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0x21, 0xa2, 0x8c, 0xc9,
  0x5f, 0x1d, 0xce, 0x41, 0x9f, 0xcd, 0x20, 0xb1, 0xdf, 0x53, 0x2f, 0x81,
  0x00, 0x00, 0x06, 0x00, 0x08, 0x00, 0x04, 0x00, 0x06, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0xe2, 0xee, 0xff, 0xff,
  0x80, 0xff, 0xff, 0xff, 0x0f, 0x00, 0x00, 0x00, 0x54, 0x4f, 0x43, 0x4f,
  0x20, 0x43, 0x6f, 0x6e, 0x76, 0x65, 0x72, 0x74, 0x65, 0x64, 0x2e, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0xbc, 0xf9, 0xff, 0xff,
  0x48, 0x01, 0x00, 0x00, 0x3c, 0x01, 0x00, 0x00, 0x30, 0x01, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x05, 0x00, 0x00, 0x00, 0x04, 0x01, 0x00, 0x00,
  0xb8, 0x00, 0x00, 0x00, 0x70, 0x00, 0x00, 0x00, 0x28, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x1a, 0xff, 0xff, 0xff, 0x02, 0x00, 0x00, 0x00,
  0x10, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x0b, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0xca, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x08, 0x1c, 0x00, 0x00, 0x00,
  0x10, 0x00, 0x00, 0x00, 0x08, 0x00, 0x00, 0x00, 0x04, 0x00, 0x04, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x03, 0x00, 0x00, 0x00, 0x07, 0x00, 0x00, 0x00, 0x08, 0x00, 0x00, 0x00,
  0x09, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0e, 0x00, 0x14, 0x00, 0x00, 0x00,
  0x08, 0x00, 0x0c, 0x00, 0x07, 0x00, 0x10, 0x00, 0x0e, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x08, 0x1c, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0xba, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x01,
  0x01, 0x00, 0x00, 0x00, 0x07, 0x00, 0x00, 0x00, 0x03, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x05, 0x00, 0x00, 0x00, 0x06, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x0e, 0x00, 0x16, 0x00, 0x00, 0x00, 0x08, 0x00, 0x0c, 0x00,
  0x07, 0x00, 0x10, 0x00, 0x0e, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x08,
  0x24, 0x00, 0x00, 0x00, 0x18, 0x00, 0x00, 0x00, 0x0c, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x06, 0x00, 0x08, 0x00, 0x07, 0x00, 0x06, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x01, 0x01, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x03, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00,
  0x03, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0a, 0x00, 0x10, 0x00, 0x04, 0x00,
  0x08, 0x00, 0x0c, 0x00, 0x0a, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x10, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x0a, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x0b, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x0a, 0x00, 0x00, 0x00, 0x0c, 0x00, 0x00, 0x00, 0xdc, 0x04, 0x00, 0x00,
  0x54, 0x04, 0x00, 0x00, 0xc4, 0x03, 0x00, 0x00, 0x54, 0x03, 0x00, 0x00,
  0xd0, 0x02, 0x00, 0x00, 0x4c, 0x02, 0x00, 0x00, 0xe0, 0x01, 0x00, 0x00,
  0x5c, 0x01, 0x00, 0x00, 0xd8, 0x00, 0x00, 0x00, 0x6c, 0x00, 0x00, 0x00,
  0x3c, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0xd8, 0xff, 0xff, 0xff,
  0x18, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x08, 0x00, 0x00, 0x00,
  0x49, 0x64, 0x65, 0x6e, 0x74, 0x69, 0x74, 0x79, 0x00, 0x00, 0x00, 0x00,
  0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x0c, 0x00, 0x0c, 0x00, 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x08, 0x00,
  0x0c, 0x00, 0x00, 0x00, 0x1c, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x0d, 0x00, 0x00, 0x00, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x32, 0x5f,
  0x69, 0x6e, 0x70, 0x75, 0x74, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0xc2, 0xfb, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x02, 0x58, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x28, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0xc4, 0xfc, 0xff, 0xff,
  0x14, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0xba, 0x2b, 0x4f, 0x38, 0x20, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75,
  0x65, 0x6e, 0x74, 0x69, 0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e,
  0x73, 0x65, 0x5f, 0x34, 0x2f, 0x4d, 0x61, 0x74, 0x4d, 0x75, 0x6c, 0x5f,
  0x62, 0x69, 0x61, 0x73, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x2a, 0xfc, 0xff, 0xff, 0x00, 0x00, 0x00, 0x09,
  0x6c, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00, 0x28, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x2c, 0xfd, 0xff, 0xff, 0x14, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0xb9, 0x36, 0x0b, 0x3c,
  0x34, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75, 0x65, 0x6e, 0x74, 0x69,
  0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x34,
  0x2f, 0x4d, 0x61, 0x74, 0x4d, 0x75, 0x6c, 0x2f, 0x52, 0x65, 0x61, 0x64,
  0x56, 0x61, 0x72, 0x69, 0x61, 0x62, 0x6c, 0x65, 0x4f, 0x70, 0x2f, 0x74,
  0x72, 0x61, 0x6e, 0x73, 0x70, 0x6f, 0x73, 0x65, 0x00, 0x00, 0x00, 0x00,
  0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00,
  0xaa, 0xfc, 0xff, 0xff, 0x00, 0x00, 0x00, 0x09, 0x6c, 0x00, 0x00, 0x00,
  0x09, 0x00, 0x00, 0x00, 0x44, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x9c, 0xfc, 0xff, 0xff, 0x30, 0x00, 0x00, 0x00, 0x24, 0x00, 0x00, 0x00,
  0x18, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x80, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0xaa, 0x7b, 0xbe, 0x3b, 0x01, 0x00, 0x00, 0x00,
  0x2e, 0xbd, 0xbd, 0x3f, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x19, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75, 0x65, 0x6e, 0x74, 0x69,
  0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x33,
  0x2f, 0x52, 0x65, 0x6c, 0x75, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0x2a, 0xfd, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x02, 0x58, 0x00, 0x00, 0x00, 0x06, 0x00, 0x00, 0x00,
  0x28, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x2c, 0xfe, 0xff, 0xff,
  0x14, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0xe3, 0x04, 0x20, 0x39, 0x20, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75,
  0x65, 0x6e, 0x74, 0x69, 0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e,
  0x73, 0x65, 0x5f, 0x33, 0x2f, 0x4d, 0x61, 0x74, 0x4d, 0x75, 0x6c, 0x5f,
  0x62, 0x69, 0x61, 0x73, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x10, 0x00, 0x00, 0x00, 0x92, 0xfd, 0xff, 0xff, 0x00, 0x00, 0x00, 0x09,
  0x6c, 0x00, 0x00, 0x00, 0x03, 0x00, 0x00, 0x00, 0x28, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x94, 0xfe, 0xff, 0xff, 0x14, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0xe8, 0x76, 0x51, 0x3c,
  0x34, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75, 0x65, 0x6e, 0x74, 0x69,
  0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x33,
  0x2f, 0x4d, 0x61, 0x74, 0x4d, 0x75, 0x6c, 0x2f, 0x52, 0x65, 0x61, 0x64,
  0x56, 0x61, 0x72, 0x69, 0x61, 0x62, 0x6c, 0x65, 0x4f, 0x70, 0x2f, 0x74,
  0x72, 0x61, 0x6e, 0x73, 0x70, 0x6f, 0x73, 0x65, 0x00, 0x00, 0x00, 0x00,
  0x02, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00,
  0x12, 0xfe, 0xff, 0xff, 0x00, 0x00, 0x00, 0x09, 0x6c, 0x00, 0x00, 0x00,
  0x07, 0x00, 0x00, 0x00, 0x44, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x04, 0xfe, 0xff, 0xff, 0x30, 0x00, 0x00, 0x00, 0x24, 0x00, 0x00, 0x00,
  0x18, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x80, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0xd2, 0x91, 0x43, 0x3c, 0x01, 0x00, 0x00, 0x00,
  0x40, 0xce, 0x42, 0x40, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x19, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75, 0x65, 0x6e, 0x74, 0x69,
  0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x32,
  0x2f, 0x52, 0x65, 0x6c, 0x75, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0x92, 0xfe, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x02, 0x5c, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x2c, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x94, 0xff, 0xff, 0xff,
  0x18, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x28, 0xb3, 0xd9, 0x38, 0x20, 0x00, 0x00, 0x00,
  0x73, 0x65, 0x71, 0x75, 0x65, 0x6e, 0x74, 0x69, 0x61, 0x6c, 0x5f, 0x31,
  0x2f, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x32, 0x2f, 0x4d, 0x61, 0x74,
  0x4d, 0x75, 0x6c, 0x5f, 0x62, 0x69, 0x61, 0x73, 0x00, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0xfe, 0xfe, 0xff, 0xff,
  0x00, 0x00, 0x00, 0x09, 0x78, 0x00, 0x00, 0x00, 0x05, 0x00, 0x00, 0x00,
  0x34, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00, 0x0c, 0x00, 0x0c, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x04, 0x00, 0x08, 0x00, 0x0c, 0x00, 0x00, 0x00,
  0x14, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0xd5, 0x6b, 0x8a, 0x3b, 0x34, 0x00, 0x00, 0x00, 0x73, 0x65, 0x71, 0x75,
  0x65, 0x6e, 0x74, 0x69, 0x61, 0x6c, 0x5f, 0x31, 0x2f, 0x64, 0x65, 0x6e,
  0x73, 0x65, 0x5f, 0x32, 0x2f, 0x4d, 0x61, 0x74, 0x4d, 0x75, 0x6c, 0x2f,
  0x52, 0x65, 0x61, 0x64, 0x56, 0x61, 0x72, 0x69, 0x61, 0x62, 0x6c, 0x65,
  0x4f, 0x70, 0x2f, 0x74, 0x72, 0x61, 0x6e, 0x73, 0x70, 0x6f, 0x73, 0x65,
  0x00, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x8a, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x09,
  0x60, 0x00, 0x00, 0x00, 0x08, 0x00, 0x00, 0x00, 0x40, 0x00, 0x00, 0x00,
  0x04, 0x00, 0x00, 0x00, 0x7c, 0xff, 0xff, 0xff, 0x2c, 0x00, 0x00, 0x00,
  0x20, 0x00, 0x00, 0x00, 0x14, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x80, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
  0x01, 0x00, 0x00, 0x00, 0x5d, 0x4f, 0xc9, 0x3c, 0x01, 0x00, 0x00, 0x00,
  0x0e, 0x86, 0xc8, 0x40, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x12, 0x00, 0x00, 0x00, 0x64, 0x65, 0x6e, 0x73, 0x65, 0x5f, 0x32, 0x5f,
  0x69, 0x6e, 0x70, 0x75, 0x74, 0x5f, 0x69, 0x6e, 0x74, 0x38, 0x00, 0x00,
  0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x0e, 0x00, 0x18, 0x00, 0x08, 0x00, 0x07, 0x00, 0x0c, 0x00,
  0x10, 0x00, 0x14, 0x00, 0x0e, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x09,
  0x6c, 0x00, 0x00, 0x00, 0x0a, 0x00, 0x00, 0x00, 0x50, 0x00, 0x00, 0x00,
  0x10, 0x00, 0x00, 0x00, 0x0c, 0x00, 0x14, 0x00, 0x04, 0x00, 0x08, 0x00,
  0x0c, 0x00, 0x10, 0x00, 0x0c, 0x00, 0x00, 0x00, 0x30, 0x00, 0x00, 0x00,
  0x24, 0x00, 0x00, 0x00, 0x18, 0x00, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00,
  0x01, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
  0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x1a, 0xde, 0x0a, 0x3c,
  0x01, 0x00, 0x00, 0x00, 0x66, 0x64, 0x87, 0x3f, 0x01, 0x00, 0x00, 0x00,
  0x13, 0x42, 0x8d, 0xbf, 0x0d, 0x00, 0x00, 0x00, 0x49, 0x64, 0x65, 0x6e,
  0x74, 0x69, 0x74, 0x79, 0x5f, 0x69, 0x6e, 0x74, 0x38, 0x00, 0x00, 0x00,
  0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00,
  0x03, 0x00, 0x00, 0x00, 0x3c, 0x00, 0x00, 0x00, 0x28, 0x00, 0x00, 0x00,
  0x10, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0a, 0x00, 0x0e, 0x00, 0x07, 0x00,
  0x00, 0x00, 0x08, 0x00, 0x0a, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x06,
  0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x06, 0x00, 0x06, 0x00, 0x05, 0x00,
  0x06, 0x00, 0x00, 0x00, 0x00, 0x72, 0x0a, 0x00, 0x0c, 0x00, 0x07, 0x00,
  0x00, 0x00, 0x08, 0x00, 0x0a, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x09,
  0x04, 0x00, 0x00, 0x00
};
unsigned int g_model_len = 2512;